There are three ways to upload or copy a file from your local computer to an Amazon Web Services (AWS) S3 Bucket using boto3, and all of them are covered in this post, including multipart uploads.
The code below works whether you are on Windows, macOS, or Linux. It will also work if you are running inside an AWS Lambda function using Python (a short Lambda sketch follows the list below).
Each method has an example using both the boto3 S3 client and the S3 resource, so you can use whichever interface you are comfortable with.
- upload_file method
- upload_fileobj method (accepts a file-like object; supports multipart upload)
- put_object method
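Since a Lambda function's filesystem is read-only except for /tmp, any file you create locally before uploading has to live there. Here is a minimal, hypothetical handler sketch; the bucket name, key, and file contents are just placeholders:

import boto3

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    # /tmp is the only writable path inside a Lambda function
    local_path = '/tmp/file_small.txt'
    with open(local_path, 'w') as f:
        f.write('hello from Lambda')

    # Upload the file we just wrote to the bucket
    s3_client.upload_file(
        Filename=local_path,
        Bucket='radishlogic-bucket',
        Key='s3_folder/file_small.txt'
    )
    return {'status': 'uploaded'}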
upload_file Method
The most straightforward way to copy a file from your local machine to an S3 Bucket is to use the upload_file function of boto3.
In the examples below, we are going to upload the local file named file_small.txt located inside local_folder.
The target S3 Bucket is named radishlogic-bucket and the target S3 object should be uploaded inside s3_folder with the filename file_small.txt. This will result in the S3 object key of s3_folder/file_small.txt.
upload_file S3 Client
import boto3

# Create a low-level S3 client
s3_client = boto3.client('s3')

# Upload the local file to the target bucket and key
s3_client.upload_file(
    Filename='local_folder/file_small.txt',
    Bucket='radishlogic-bucket',
    Key='s3_folder/file_small.txt'
)
S3 Client upload_file function documentation can be found here.
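If you want to handle a failed upload (for example, a missing bucket or missing permissions) instead of letting the exception bubble up, you can wrap the call in a try/except. A minimal sketch, assuming you only want to log the failure; boto3's managed transfer typically raises S3UploadFailedError, which wraps the underlying ClientError:

import boto3
from boto3.exceptions import S3UploadFailedError
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

try:
    s3_client.upload_file(
        Filename='local_folder/file_small.txt',
        Bucket='radishlogic-bucket',
        Key='s3_folder/file_small.txt'
    )
except (S3UploadFailedError, ClientError) as error:
    # e.g. bucket does not exist or credentials lack s3:PutObject
    print(f'Upload failed: {error}')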
upload_file S3 Resource
import boto3

# Create an S3 resource and get the target bucket
s3_resource = boto3.resource('s3')
s3_bucket = s3_resource.Bucket(name='radishlogic-bucket')

# Upload the local file; the bucket name is implied by the Bucket object
s3_bucket.upload_file(
    Filename='local_folder/file_small.txt',
    Key='s3_folder/file_small.txt'
)
S3 Resource upload_file method documentation can be found here.
Note: upload_file is a managed transfer, so it automatically performs a multipart upload in multiple threads when the file is large enough (by default, larger than 8 MB). If you want to upload from an open file handle or another file-like object instead of a path on disk, use the upload_fileobj method described next.
You can tune the multipart behavior of these managed transfers, as shown below.
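Both the multipart threshold and the chunk size can be adjusted through a TransferConfig object from boto3.s3.transfer, passed via the Config parameter. A minimal sketch, with the library defaults written out explicitly so you can see what to change:

import boto3
from boto3.s3.transfer import TransferConfig

s3_client = boto3.client('s3')

# Defaults shown explicitly: multipart starts above 8 MB,
# in 8 MB chunks, using up to 10 threads
config = TransferConfig(
    multipart_threshold=8 * 1024 * 1024,
    multipart_chunksize=8 * 1024 * 1024,
    max_concurrency=10,
    use_threads=True
)

s3_client.upload_file(
    Filename='local_folder/file_big.txt',
    Bucket='radishlogic-bucket',
    Key='s3_folder/file_big.txt',
    Config=config
)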
upload_fileobj Method
If you want to upload from a file-like object, such as an open file handle, use the upload_fileobj function. Like upload_file, it is a managed transfer that supports multipart uploads, so it works well for bigger files (in this example, greater than 100 MB).
The following are the source and destination details.
Local file details
- Filename: file_big.txt
- Local folder name: local_folder
AWS S3 destination details
- S3 Bucket: radishlogic-bucket
- S3 Folder Name: s3_folder
- S3 Object Name: file_big.txt
- Resulting S3 Object key: s3_folder/file_big.txt
upload_fileobj S3 Client
import boto3

LOCAL_FILENAME = 'local_folder/file_big.txt'

s3_client = boto3.client('s3')

# upload_fileobj needs a file-like object opened in binary mode
with open(LOCAL_FILENAME, 'rb') as data:
    s3_client.upload_fileobj(
        Fileobj=data,
        Bucket='radishlogic-bucket',
        Key='s3_folder/file_big.txt'
    )
S3 Client upload_fileobj method reference can be found here.
upload_fileobj S3 Resource
import boto3

LOCAL_FILENAME = 'local_folder/file_big.txt'

s3_resource = boto3.resource('s3')
s3_bucket = s3_resource.Bucket(name='radishlogic-bucket')

# Open the file in binary mode and stream it to the bucket
with open(LOCAL_FILENAME, 'rb') as data:
    s3_bucket.upload_fileobj(
        Fileobj=data,
        Key='s3_folder/file_big.txt'
    )
S3 Resource upload_fileobj method reference can be found here.
If you noticed that upload_fileobj takes a few more lines, that is because the function requires a file-like object in binary mode as its Fileobj parameter. That is why we had to use Python's built-in open() function with the 'rb' mode (r is read mode, b is binary mode).
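A side effect of this design is that Fileobj does not have to come from open() at all; any binary file-like object works. A quick sketch using an in-memory io.BytesIO buffer (the key name here is just an example):

import io
import boto3

s3_client = boto3.client('s3')

# Any binary file-like object works, not just files on disk
buffer = io.BytesIO(b'some bytes generated in memory')

s3_client.upload_fileobj(
    Fileobj=buffer,
    Bucket='radishlogic-bucket',
    Key='s3_folder/file_from_memory.txt'
)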
Since I was curious, I also tested upload_fileobj with the smaller file, file_small.txt, and it still worked. Therefore, if you are not sure how big the file you are uploading will be, use upload_fileobj and it will automatically decide whether to use a multipart upload or not.
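For big files it also helps to see progress. upload_file and upload_fileobj both accept an optional Callback parameter, which boto3 calls with the number of bytes transferred in each chunk. A minimal sketch that prints a running total:

import os
import boto3

LOCAL_FILENAME = 'local_folder/file_big.txt'

s3_client = boto3.client('s3')
total_size = os.path.getsize(LOCAL_FILENAME)
sent = 0

def progress(bytes_transferred):
    # Called by boto3 as each chunk is uploaded
    global sent
    sent += bytes_transferred
    print(f'{sent}/{total_size} bytes uploaded')

with open(LOCAL_FILENAME, 'rb') as data:
    s3_client.upload_fileobj(
        Fileobj=data,
        Bucket='radishlogic-bucket',
        Key='s3_folder/file_big.txt',
        Callback=progress
    )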
put_object Method
Another method is to use the put_object function of boto3 S3. It is written similarly to upload_fileobj; the main downside is that it does not support multipart upload, so a single put_object call is limited to objects of up to 5 GB.
If you're going to use this to upload a local file to an AWS S3 Bucket, then I suggest just using the upload_file function, since it achieves the same result with fewer lines of code.
Below are the examples for using put_object method of boto3 S3.
put_object S3 Client
import boto3

LOCAL_FILENAME = 'local_folder/file_small.txt'

s3_client = boto3.client('s3')

# put_object sends the whole body in a single PUT request
with open(LOCAL_FILENAME, 'rb') as data:
    s3_client.put_object(
        Bucket='radishlogic-bucket',
        Key='s3_folder/file_small.txt',
        Body=data
    )
S3 Client put_object function documentation can be found here.
put_object S3 Resource
import boto3

LOCAL_FILENAME = 'local_folder/file_small.txt'

s3_resource = boto3.resource('s3')
s3_bucket = s3_resource.Bucket(name='radishlogic-bucket')

# Upload the open file as the object body in a single PUT request
with open(LOCAL_FILENAME, 'rb') as data:
    s3_bucket.put_object(
        Key='s3_folder/file_small.txt',
        Body=data
    )
S3 Resource put_object function reference can be found here.
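One convenience of put_object over the upload_* methods: the Body parameter does not have to be a file-like object. You can pass raw bytes directly, which is handy for small content generated at runtime. A quick sketch with a hypothetical key name:

import boto3

s3_client = boto3.client('s3')

# Body accepts raw bytes directly; no local file needed
s3_client.put_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/generated.txt',
    Body=b'content generated at runtime'
)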
I hope this post helped you with the different methods to upload or copy a local file to an AWS S3 Bucket.
Let me know your experience in the comments below.