How to write a Dictionary to JSON file in S3 Bucket using boto3 and Python

If you want to write a Python dictionary to a JSON file in S3, then you can use the code examples below.

There are two code examples doing the same thing because boto3 provides both a client interface and a resource interface for accessing AWS S3.

Related: Reading a JSON file in S3 and store it in a Dictionary using boto3 and Python

Writing Python Dictionary to an S3 Object using boto3 Client

import boto3
import json
from datetime import date

data_dict = {
    'Name': 'Daikon Retek',
    'Birthdate': date(2000, 4, 7),
    'Subjects': ['Math', 'Science', 'History']
}

# Convert the dictionary to a JSON string (default=str serializes the date object)
data_string = json.dumps(data_dict, indent=2, default=str)


# Upload JSON String to an S3 Object
client = boto3.client('s3')

client.put_object(
    Bucket='radishlogic-bucket', 
    Key='s3_folder/client_data.json',
    Body=data_string
)
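
The full post pairs the client example with a resource-interface version. A minimal sketch of the equivalent resource call is below; the key name `s3_folder/resource_data.json` is an assumption, and the upload is wrapped in a function so the serialization part runs even without AWS set up:

```python
import json
from datetime import date

data_dict = {
    'Name': 'Daikon Retek',
    'Birthdate': date(2000, 4, 7),
    'Subjects': ['Math', 'Science', 'History']
}

# Convert the dictionary to a JSON string (default=str serializes the date object)
data_string = json.dumps(data_dict, indent=2, default=str)

def upload_json_string(bucket, key, body):
    """Upload a string to an S3 object using the boto3 resource interface."""
    import boto3  # imported lazily so the serialization above runs without boto3 configured
    s3 = boto3.resource('s3')
    s3.Object(bucket, key).put(Body=body)

# Requires AWS credentials; the key name here is illustrative:
# upload_json_string('radishlogic-bucket', 's3_folder/resource_data.json', data_string)
```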
Continue reading How to write a Dictionary to JSON file in S3 Bucket using boto3 and Python

How to download files from S3 Bucket using boto3 and Python

If you want to download a file from an AWS S3 Bucket using Python, then you can use the code samples below.

The code below uses the AWS SDK for Python, named boto3.

boto3 provides three methods to download a file.

  1. download_file()
  2. download_fileobj() – with multipart download
  3. get_object()

For each method, you can use either the client class or the resource class of boto3; both classes will be covered for each of the methods above.

Note: All examples will work in any Python 3 environment running on Windows, macOS, or Linux.
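
As a preview, here are minimal sketches of the three methods using the client class; the bucket, key, and local path in the example call are illustrative placeholders:

```python
def download_file_example(bucket, key, local_path):
    """Method 1: download_file() saves the object straight to a local path."""
    import boto3  # imported inside the function so the sketch loads without AWS set up
    boto3.client('s3').download_file(bucket, key, local_path)

def download_fileobj_example(bucket, key, local_path):
    """Method 2: download_fileobj() streams the object into an open file-like object."""
    import boto3
    with open(local_path, 'wb') as f:
        boto3.client('s3').download_fileobj(bucket, key, f)

def get_object_example(bucket, key):
    """Method 3: get_object() returns the object's contents in the response body."""
    import boto3
    response = boto3.client('s3').get_object(Bucket=bucket, Key=key)
    return response['Body'].read()

# Example call (requires AWS credentials and an existing object):
# download_file_example('radishlogic-bucket', 's3_folder/client_data.json', 'client_data.json')
```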

Continue reading How to download files from S3 Bucket using boto3 and Python

How to upload a file to S3 Bucket using boto3 and Python

There are 3 ways to upload or copy a file from your local computer to an Amazon Web Services (AWS) S3 Bucket using boto3. All of these will be discussed in this post, including multipart uploads.

The code below will work on Windows, macOS, and Linux. It will also work if you are working in a Lambda function using Python.

Each method will have an example using boto3 S3 client and S3 resource so you can use whatever method you are comfortable with.
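
As a quick preview, here are sketches of the three client-class upload calls; the local path, bucket, and key in any real call would be your own:

```python
def upload_file_example(local_path, bucket, key):
    """upload_file() uploads a local file, switching to multipart uploads for large files."""
    import boto3  # imported inside the function so the sketch loads without AWS set up
    boto3.client('s3').upload_file(local_path, bucket, key)

def upload_fileobj_example(local_path, bucket, key):
    """upload_fileobj() uploads from an open file-like object."""
    import boto3
    with open(local_path, 'rb') as f:
        boto3.client('s3').upload_fileobj(f, bucket, key)

def put_object_example(local_path, bucket, key):
    """put_object() sends the contents in a single request (no multipart handling)."""
    import boto3
    with open(local_path, 'rb') as f:
        boto3.client('s3').put_object(Bucket=bucket, Key=key, Body=f.read())
```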


Continue reading How to upload a file to S3 Bucket using boto3 and Python

How to get the ARN of an S3 Bucket

Each resource in AWS has an Amazon Resource Name (ARN). An ARN is a unique identifier of your resource; its value is not duplicated in any other account and exists only in your account.

It’s used especially in IAM policies, where you set which resources you will allow access to.

You can actually predict the ARN of an S3 Bucket since it has a standard format of arn:aws:s3:::S3_BUCKET_NAME.
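
For example, building the ARN for the bucket used elsewhere on this site is a one-line string operation:

```python
bucket_name = 'radishlogic-bucket'

# S3 Bucket ARNs follow the fixed format arn:aws:s3:::S3_BUCKET_NAME
bucket_arn = f'arn:aws:s3:::{bucket_name}'
print(bucket_arn)  # arn:aws:s3:::radishlogic-bucket
```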

But if, like me, you are afraid of making a mistake while typing the S3 Bucket ARN, you may prefer going to the AWS Console, finding the S3 Bucket ARN, and copy-pasting it.

Follow the instructions below to get the S3 Bucket ARN.

Continue reading How to get the ARN of an S3 Bucket

Minimum IAM Permission to create S3 presigned URLs

If you want to publicly share a file or an object inside a private S3 bucket, you will need to create an S3 presigned URL. This creates a temporary link to the S3 file which you can share and access publicly.

As best practice, we must apply least-privilege permissions to the IAM user or IAM role that will create the S3 presigned URL. This brings us to the question: what is the minimum IAM permission needed to create an S3 presigned URL?
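
For context, here is a minimal sketch of how a presigned URL is generated with boto3; the commented call uses an illustrative bucket and key, and the boto3 import is deferred so the sketch loads without AWS set up:

```python
def create_presigned_url(bucket, key, expires_in=3600):
    """Create a temporary public link to a private S3 object."""
    import boto3  # imported inside the function so the sketch loads without AWS set up
    s3_client = boto3.client('s3')
    return s3_client.generate_presigned_url(
        'get_object',
        Params={'Bucket': bucket, 'Key': key},
        ExpiresIn=expires_in,  # lifetime of the link in seconds
    )

# Requires credentials allowed to read the object:
# url = create_presigned_url('radishlogic-bucket', 's3_folder/client_data.json')
```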

Continue reading Minimum IAM Permission to create S3 presigned URLs

How to download all files in an S3 Bucket using AWS CLI

There are many ways to download files from an S3 Bucket, but if you are downloading an entire S3 Bucket then I would recommend using AWS CLI and running the command aws s3 sync s3://SOURCE_BUCKET LOCAL_DESTINATION.

In the examples below, I’m going to download the contents of my S3 Bucket named radishlogic-bucket.

My S3 Bucket in the AWS Console


Example 1: Download S3 Bucket to Current Local Folder

If you want to download the whole S3 Bucket into the folder that you are currently in, then you should use the command aws s3 sync s3://SOURCE_BUCKET . where the trailing dot refers to the current folder.

In our example S3 Bucket above, the AWS CLI command will be like this.
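
Applying the sync command to the example bucket and downloading into the current folder gives:

```shell
aws s3 sync s3://radishlogic-bucket .
```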

Continue reading How to download all files in an S3 Bucket using AWS CLI

How to create IAM User Access Keys via AWS CLI

To create programmatic Access Keys for an AWS IAM User using the AWS CLI, run the command aws iam create-access-key.

In the command below, replace MyUser with the username of your target IAM User.

aws iam create-access-key --user-name MyUser

This will return the following JSON-formatted string.
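
The returned structure has this shape; the field names follow the AWS CLI output, while the values below are AWS's documented placeholder examples, not real keys:

```json
{
    "AccessKey": {
        "UserName": "MyUser",
        "AccessKeyId": "AKIAIOSFODNN7EXAMPLE",
        "Status": "Active",
        "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
        "CreateDate": "2024-01-01T00:00:00+00:00"
    }
}
```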

Continue reading How to create IAM User Access Keys via AWS CLI

How to create IAM User Access Keys using AWS Console

If you want to be able to control your AWS resources from your local computer, you will use either the AWS CLI or an AWS SDK. To use those tools, you will need an Access Key ID and a Secret Access Key.

In this post, we will show you how you can generate your own Access Keys so you can programmatically access your AWS resources.

In the instructions below, the target username that I want to create Access Keys for is rabano. Yours will be different.

Continue reading How to create IAM User Access Keys using AWS Console

How to access the C: Drive in Amazon Workspaces

The C: Drive, or root volume, in Amazon WorkSpaces cannot be seen when you open File Explorer.

This post will show how you can access the C: Drive when it is not shown.

If you want the C: Drive to be shown permanently, then reading my post about it here will help.

Below are three ways you can access the C: Drive.


Access C: Drive with Windows File Explorer

To access the C: Drive with Windows File Explorer, go to the address bar and enter C:. This will bring you to the C: Drive.

Continue reading How to access the C: Drive in Amazon Workspaces