How to stop object file versioning in an S3 Bucket using Python boto3

To stop versioning in an S3 Bucket, we can use the boto3 method put_bucket_versioning() with the VersioningConfiguration parameter set to a Status of Suspended.

Below are 3 methods to stop versioning in an S3 Bucket using the AWS SDK for Python, boto3.

The Python scripts below do the same thing: suspend the versioning of the target S3 Bucket named ‘radishlogic-bucket’.

You may use any method that you like depending on which you are comfortable using.

Interestingly, suspending versioning uses the same boto3 method as enabling versioning, put_bucket_versioning(). The only difference is that the ‘Status’ is ‘Suspended’ instead of ‘Enabled’.
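
As a quick preview, here is a minimal sketch of the client method (the other variations are in the full post); it assumes your AWS credentials are already configured:

```python
import boto3

# Create a low-level S3 client using your configured AWS credentials.
s3_client = boto3.client('s3')

# Suspending versioning uses the same call as enabling it,
# only with Status set to 'Suspended' instead of 'Enabled'.
s3_client.put_bucket_versioning(
    Bucket='radishlogic-bucket',
    VersioningConfiguration={'Status': 'Suspended'}
)
```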

Continue reading How to stop object file versioning in an S3 Bucket using Python boto3

How to check if versioning is enabled in an S3 bucket using Python boto3

To check if versioning is enabled in an S3 Bucket using Python boto3, we will need to use the get_bucket_versioning() method of boto3 S3.

Below are 3 ways to get the S3 Bucket versioning status using the AWS SDK for Python, boto3.

The Python scripts below all do the same thing. They check the status of versioning in the target S3 Bucket named radishlogic-bucket.

You can choose whichever method you are comfortable with.
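
For a quick preview, here is one possible version using the client method; note that S3 omits the Status key entirely if versioning was never enabled on the bucket:

```python
import boto3

s3_client = boto3.client('s3')

response = s3_client.get_bucket_versioning(Bucket='radishlogic-bucket')

# 'Status' is only present if versioning has been enabled or
# suspended at least once; otherwise fall back to a label.
status = response.get('Status', 'Not Enabled')
print(f'Versioning status: {status}')
```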

Continue reading How to check if versioning is enabled in an S3 bucket using Python boto3

How to create an S3 Bucket using Python boto3

To create an S3 Bucket in your target AWS Account, you will need to use the create_bucket() method of boto3 S3.

The method requires only the parameter Bucket, which is your target bucket name.

But I highly recommend that you also use the CreateBucketConfiguration parameter to set the region of the S3 Bucket. If you do not set the CreateBucketConfiguration parameter, boto3 will create your S3 Bucket in the N. Virginia region (us-east-1) by default.

Below are two ways to create an S3 Bucket using Python boto3.

Both Python scripts do the same thing. They will create an S3 Bucket named radishlogic-bucket in the Singapore region (ap-southeast-1).

You can choose whichever method you are comfortable with.
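
As a quick sketch, the client version looks something like this, with the region set through CreateBucketConfiguration:

```python
import boto3

s3_client = boto3.client('s3')

# CreateBucketConfiguration sets the bucket's region;
# without it the bucket is created in us-east-1.
s3_client.create_bucket(
    Bucket='radishlogic-bucket',
    CreateBucketConfiguration={'LocationConstraint': 'ap-southeast-1'}
)
```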

Continue reading How to create an S3 Bucket using Python boto3

How to list all S3 Buckets using Python boto3

To list the S3 Buckets inside an AWS Account, you will need to use the list_buckets() method of boto3.

Below are two example scripts that you can use to retrieve all S3 Buckets inside an Amazon Web Services account.

Both example scripts do the same thing. They query AWS for all the S3 Buckets inside the account and return the bucket names in a Python list.

Since both will do the same thing, you can use whichever method you prefer.
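
Here is a minimal sketch of the client version:

```python
import boto3

s3_client = boto3.client('s3')

response = s3_client.list_buckets()

# Pull just the bucket names out of the response into a Python list.
bucket_names = [bucket['Name'] for bucket in response['Buckets']]
print(bucket_names)
```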

Continue reading How to list all S3 Buckets using Python boto3

How to list files in an S3 bucket folder using boto3 and Python

If you want to list the files/objects inside a specific folder within an S3 bucket, then you will need to use the list_objects_v2 method with the Prefix parameter in boto3.

Below are 3 example codes on how to list the objects in an S3 bucket folder.

The code gets all the files/objects inside the S3 bucket named radishlogic-bucket within the folder named s3_folder/ and adds their keys to a Python list (s3_object_key_list). It then prints each of the object keys in the list, as well as the number of files in the folder.
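
As a preview, here is one way to do it with the client method; a paginator is used since list_objects_v2 returns at most 1,000 keys per call:

```python
import boto3

s3_client = boto3.client('s3')

s3_object_key_list = []

# Paginate because list_objects_v2 returns at most 1,000 keys per call.
paginator = s3_client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='radishlogic-bucket', Prefix='s3_folder/'):
    for obj in page.get('Contents', []):
        s3_object_key_list.append(obj['Key'])

for key in s3_object_key_list:
    print(key)

print(f'Number of files in folder: {len(s3_object_key_list)}')
```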

Continue reading How to list files in an S3 bucket folder using boto3 and Python

How to list all objects in an S3 Bucket using boto3 and Python

If you need to list all files/objects inside an AWS S3 Bucket, then you will need to use the list_objects_v2 method in boto3.

Below are 3 example codes showing how to list all files in a target S3 Bucket.

You can use any of the 3 options since they all do the same thing.

Each will get all of the files inside the S3 Bucket radishlogic-bucket using Python boto3, put them in a Python list, then print each object key. The listing is recursive, so files are included whether or not they are inside a folder.

At the end, it will also print the number of items inside the S3 Bucket.
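
As a quick sketch, here is a resource-based version, which handles pagination for you through .all():

```python
import boto3

s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('radishlogic-bucket')

# .all() transparently paginates through every object in the bucket,
# including objects nested inside folders.
s3_object_key_list = [obj.key for obj in bucket.objects.all()]

for key in s3_object_key_list:
    print(key)

print(f'Total objects: {len(s3_object_key_list)}')
```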

Continue reading How to list all objects in an S3 Bucket using boto3 and Python

How to delete a file in AWS S3 using boto3 and Python

To delete a file inside an AWS S3 Bucket using Python, you will need to use the delete_object() method of boto3.

Below are 3 examples to delete an S3 file.

You can use any of the 3 options since they all do the same thing. Each will delete the file with the key s3_folder/file.txt inside the S3 bucket named radishlogic-bucket using Python boto3.
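
For a quick preview, a minimal client-based sketch looks like this:

```python
import boto3

s3_client = boto3.client('s3')

# Deletes the object; on a bucket with versioning enabled,
# this adds a delete marker instead of permanently removing it.
s3_client.delete_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt'
)
```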

Continue reading How to delete a file in AWS S3 using boto3 and Python

How to read a file in S3 and store it in a String using Python and boto3

If you want to get a file from an S3 Bucket and then put it in a Python string, try the examples below.

boto3, the AWS SDK for Python, offers two distinct methods for accessing files or objects in Amazon S3: the client method and the resource method.

Option 1 uses the boto3.client('s3') method, while options 2 and 3 use the boto3.resource('s3') method.

All 3 options do the exact same thing, so pick the one that you feel comfortable with or the one that fits your use case.
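
As a quick sketch of option 1, assuming a UTF-8 text file at the example key s3_folder/file.txt:

```python
import boto3

s3_client = boto3.client('s3')

# The key below is just an example; replace it with your own file.
response = s3_client.get_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt'
)

# Body is a streaming object: read the raw bytes, then decode to a string.
file_content = response['Body'].read().decode('utf-8')
print(file_content)
```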


Continue reading How to read a file in S3 and store it in a String using Python and boto3

How to generate S3 presigned URL using boto3 and Python

If you want to give your users temporary access to a private S3 file without giving them access to the AWS console, you will need to generate an S3 presigned URL of your target file.

To generate and test the S3 presigned URL, you can try the code below.
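
Here is a minimal sketch of generating one for a GET request; the key s3_folder/file.txt is just an example, and the URL expires after an hour:

```python
import boto3

s3_client = boto3.client('s3')

# The key is just an example; ExpiresIn is in seconds (3600 = 1 hour).
presigned_url = s3_client.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'radishlogic-bucket',
        'Key': 's3_folder/file.txt'
    },
    ExpiresIn=3600
)
print(presigned_url)
```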

Continue reading How to generate S3 presigned URL using boto3 and Python

How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python

If you want to get a JSON file from an S3 Bucket and load it into a Python Dictionary then you can use the example codes below.

There are 4 scenarios for the example scripts below.

  1. Basic JSON file from S3 to Python Dictionary
  2. With Try/Except block
  3. With datetime, date, and time conversions
  4. Running the code in a Lambda Function

AWS boto3 provides 2 ways to access S3 files: boto3.client('s3') and boto3.resource('s3'). For each of the example scenarios above, code will be provided for both methods.

Related: Writing a Dictionary to JSON file in S3 using boto3 and Python

Since both methods will function the same, you can choose whichever method you like.
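
As a quick preview of scenario 1 using the client method (the key data/sample.json is a made-up example):

```python
import json
import boto3

s3_client = boto3.client('s3')

# 'data/sample.json' is a hypothetical key used only for illustration.
response = s3_client.get_object(
    Bucket='radishlogic-bucket',
    Key='data/sample.json'
)

# Read the object body and parse the JSON into a Python dictionary.
json_dict = json.loads(response['Body'].read().decode('utf-8'))
print(json_dict)
```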

Continue reading How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python