How to read a file in S3 and store it in a String using Python and boto3

If you want to get a file from an S3 Bucket and then put it in a Python string, try the examples below.

boto3, the AWS SDK for Python, offers two distinct methods for accessing files or objects in Amazon S3: the client method and the resource method.

Option 1 uses the boto3.client('s3') method, while options 2 and 3 use the boto3.resource('s3') method.

All 3 options do the exact same thing, so use the one that you feel comfortable with or the one that fits your use case.
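
As a quick preview, here is a minimal sketch of option 1 using the boto3 client; the bucket name and key are placeholders, so replace them with your own.

import boto3

# Get the object from S3 using the boto3 client.
s3_client = boto3.client('s3')

response = s3_client.get_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt'
)

# The Body is a StreamingBody; read the bytes and decode them to a string.
file_content = response['Body'].read().decode('utf-8')
print(file_content)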


Continue reading How to read a file in S3 and store it in a String using Python and boto3

How to write Python string to a file in S3 Bucket using boto3

To write a file from a Python string directly to an S3 bucket, we need to use the boto3 package.

There are 2 ways to write a file in S3 using boto3. The first is via the boto3 client, and the second is via the boto3 resource. Both of these methods will be shown below.

S3 objects and keys

If you are new to AWS S3, you might be confused with some of the terms. So we’ll define some of them here. If you already know what objects and keys are then you can skip this section.

S3 objects are the same as files. When we run the put_object method, it means that we are putting a file into S3.

S3 keys are the same as the filename with its full path. So if we want to create an object in S3 named filename.txt within the foobar folder, then the key is foobar/filename.txt.

Now that we have clarified some of the AWS S3 terms, follow the details below to start writing Python strings directly to objects in S3.
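
As a quick preview, here is a minimal sketch of the client method; the bucket name is a placeholder, and the key follows the foobar/filename.txt example above.

import boto3

# Upload a Python string as an S3 object using the boto3 client.
s3_client = boto3.client('s3')

data_string = 'Hello from a Python string!'

s3_client.put_object(
    Bucket='radishlogic-bucket',
    Key='foobar/filename.txt',
    Body=data_string
)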

Continue reading How to write Python string to a file in S3 Bucket using boto3

How to generate S3 presigned URL using boto3 and Python

If you want to give your users temporary access to a private S3 file without giving them access to the AWS console, you will need to generate an S3 presigned URL of your target file.

To generate and test the S3 presigned URL, you can try the code below.
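
For a quick preview, here is a minimal sketch using the boto3 client's generate_presigned_url method; the bucket name, key, and expiry are placeholders.

import boto3

s3_client = boto3.client('s3')

# Generate a presigned URL that allows a GET of the target file
# for 3600 seconds (1 hour).
url = s3_client.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'radishlogic-bucket',
        'Key': 's3_folder/file.txt'
    },
    ExpiresIn=3600
)
print(url)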

Continue reading How to generate S3 presigned URL using boto3 and Python

How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python

If you want to get a JSON file from an S3 Bucket and load it into a Python Dictionary, then you can use the example code below.

There are 4 scenarios for the example scripts below.

  1. Basic JSON file from S3 to Python Dictionary
  2. With Try/Except block
  3. With datetime, date, and time conversions
  4. Running the code in a Lambda Function

AWS boto3 provides 2 ways to access S3 files: boto3.client('s3') and boto3.resource('s3'). For each of the example scenarios above, code will be provided for both methods.

Related: Writing a Dictionary to JSON file in S3 using boto3 and Python

Since both methods will function the same, you can choose whichever method you like.
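
As a preview of scenario 1, here is a minimal sketch using boto3.client('s3'); the bucket name and key are placeholders.

import boto3
import json

s3_client = boto3.client('s3')

# Get the JSON object from S3.
response = s3_client.get_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/data.json'
)

# Decode the object's bytes and parse the JSON into a Python Dictionary.
json_dict = json.loads(response['Body'].read().decode('utf-8'))
print(json_dict)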

Continue reading How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python

How to write a Dictionary to JSON file in S3 Bucket using boto3 and Python

If you want to write a Python dictionary to a JSON file in S3, then you can use the code examples below.

There are two code examples doing the same thing below because boto3 provides a client method and a resource method to edit and access AWS S3.

Related: Reading a JSON file in S3 and storing it in a Dictionary using boto3 and Python

Writing Python Dictionary to an S3 Object using boto3 Client

import boto3
import json
from datetime import date

data_dict = {
    'Name': 'Daikon Retek',
    'Birthdate': date(2000, 4, 7),
    'Subjects': ['Math', 'Science', 'History']
}

# Convert Dictionary to JSON String
data_string = json.dumps(data_dict, indent=2, default=str)


# Upload JSON String to an S3 Object
client = boto3.client('s3')

client.put_object(
    Bucket='radishlogic-bucket', 
    Key='s3_folder/client_data.json',
    Body=data_string
)
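
Writing Python Dictionary to an S3 Object using boto3 Resource

For comparison, here is a minimal sketch of the resource method doing the same thing; the bucket name and key are placeholders.

import boto3
import json
from datetime import date

data_dict = {
    'Name': 'Daikon Retek',
    'Birthdate': date(2000, 4, 7),
    'Subjects': ['Math', 'Science', 'History']
}

# Convert Dictionary to JSON String
data_string = json.dumps(data_dict, indent=2, default=str)

# Upload JSON String to an S3 Object via the resource interface
s3 = boto3.resource('s3')

s3.Object('radishlogic-bucket', 's3_folder/resource_data.json').put(
    Body=data_string
)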

Continue reading How to write a Dictionary to JSON file in S3 Bucket using boto3 and Python

How to download files from S3 Bucket using boto3 and Python

If you want to download a file from an AWS S3 Bucket using Python, then you can use the sample code below.

The code below uses boto3, the AWS SDK for Python.

boto3 provides three methods to download a file.

  1. download_file()
  2. download_fileobj() – with multipart download
  3. get_object()

Then for each method, you can use the client class or the resource class of boto3.

Both of the classes will be used for each of the methods above.

Note: All examples will work in any Python 3 environment running on Windows, macOS, or Linux.
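
As a quick preview, here is a minimal sketch of download_file() using the boto3 client; the bucket name, key, and local filename are placeholders.

import boto3

s3_client = boto3.client('s3')

# Download the S3 object to a local file named file.txt.
s3_client.download_file(
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt',
    Filename='file.txt'
)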

Continue reading How to download files from S3 Bucket using boto3 and Python

How to upload a file to S3 Bucket using boto3 and Python

There are 3 ways to upload or copy a file from your local computer to an Amazon Web Services (AWS) S3 Bucket using boto3. All of these will be discussed in this post, including multipart uploads.

The code below will work if you are on Windows, macOS, or Linux. It will also work if you are running in a Lambda function using Python.

Each method will have an example using the boto3 S3 client and the S3 resource, so you can use whichever you are comfortable with.
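
As a quick preview, here is a minimal sketch of the simplest method, upload_file(), using the boto3 client; the local filename, bucket name, and key are placeholders.

import boto3

s3_client = boto3.client('s3')

# Upload the local file file.txt to the target S3 bucket and key.
# upload_file() automatically performs a multipart upload for large files.
s3_client.upload_file(
    Filename='file.txt',
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt'
)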


Continue reading How to upload a file to S3 Bucket using boto3 and Python

Convert boto3 AMI Creation Date from string to Python datetime

When retrieving the AMI Creation Date from boto3, it is returned as a string. Visually, this is okay, but it is challenging to do operations and comparisons on the AMI Creation Date in this format.

To solve the issue, we need to convert the AMI Creation Date from a string to a datetime before we can do any operations on it.

The AMI Creation Date string looks like 2019-09-18T07:34:34.000Z. To convert this, we need to use the strptime function of the datetime.datetime class.
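
Here is a minimal sketch of the conversion, using the sample Creation Date string above.

from datetime import datetime

creation_date_str = '2019-09-18T07:34:34.000Z'

# Parse the AMI Creation Date string into a datetime object.
creation_date = datetime.strptime(creation_date_str, '%Y-%m-%dT%H:%M:%S.%fZ')

print(creation_date)  # 2019-09-18 07:34:34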

Continue reading Convert boto3 AMI Creation Date from string to Python datetime

EC2 with IAM Role: CloudFormation Sample Template

Creating an EC2 Instance with an IAM Role is easy when you do it via the AWS Console, but doing this with CloudFormation is not as direct. You will need an Instance Profile to connect an EC2 with an IAM Role.

TL;DR: See the CloudFormation Template below.
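
Here is a minimal sketch of the key wiring; the resource names, AMI ID, and instance type are placeholders.

AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyIamRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: ec2.amazonaws.com
            Action: sts:AssumeRole
  # The Instance Profile is the glue between the EC2 Instance and the IAM Role.
  MyInstanceProfile:
    Type: AWS::IAM::InstanceProfile
    Properties:
      Roles:
        - !Ref MyIamRole
  MyEc2Instance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-12345678  # placeholder AMI ID
      InstanceType: t3.micro
      IamInstanceProfile: !Ref MyInstanceProfile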

Continue reading EC2 with IAM Role: CloudFormation Sample Template

How to get the ARN of an S3 Bucket

Each resource in AWS has an Amazon Resource Name (ARN). An ARN is a unique identifier of your resource; no other resource, even in another AWS account, shares the same ARN.

It’s used especially in IAM policies, where you specify which resources to allow access to.

You can actually predict the ARN of an S3 Bucket since it has a standard format of arn:aws:s3:::S3_BUCKET_NAME.
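
Since the format is predictable, you can also build the ARN yourself; here is a minimal sketch in Python with a placeholder bucket name.

bucket_name = 'radishlogic-bucket'  # placeholder bucket name
bucket_arn = f'arn:aws:s3:::{bucket_name}'

print(bucket_arn)  # arn:aws:s3:::radishlogic-bucket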

But if you are like me and afraid of making a mistake typing the S3 Bucket ARN, you can go to the AWS Console, search for the S3 Bucket ARN, and copy-paste it.

Follow the instructions below to get the S3 Bucket ARN.

Continue reading How to get the ARN of an S3 Bucket