🚀 Automating AWS S3 Bucket Operations with Python & Boto3

📘 Introduction

Amazon S3 (Simple Storage Service) is a powerful object storage service from AWS. While the AWS Console offers a user-friendly UI, automating S3 operations with Python can save you time and help scale your workflows.

In this post, I'll walk you through a Python script I created that handles:

- Creating and deleting S3 buckets
- Uploading, listing, and deleting files
- Neatly formatted CLI output for ease of use

Let's dive in!
🛠️ Prerequisites

Before you start, make sure you have:

- ✅ An AWS account
- ✅ AWS CLI configured (`aws configure`)
- ✅ Python 3 installed
- ✅ Boto3 installed: `pip install boto3`
📁 Project Structure

Here's the structure of the project:

```
project/
├── aws_wrapper.py   # Contains all S3 operation functions
├── aws_caller.py    # Entry point to call those functions
└── testupl.txt      # Sample file to upload
```
🧠 How It Works

aws_wrapper.py

This file includes all S3 utility functions.
```python
def create_bucket(s3obj, bucket_name, region_name):
    if region_name == 'us-east-1':
        # us-east-1 is the default region; S3 rejects a LocationConstraint for it
        s3obj.create_bucket(Bucket=bucket_name)
    else:
        # Any other region must be specified via CreateBucketConfiguration
        s3obj.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={'LocationConstraint': region_name}
        )
    print(f"\n✅ Bucket '{bucket_name}' created successfully in '{region_name}'\n")

def show_s3_buck(s3obj):
    print("📦 The Buckets in S3 are as below:")
    for bucket in s3obj.buckets.all():
        print(f" - {bucket.name}")
    print()  # Empty line for spacing

def upload_file(s3obj, bucket_name, file_path, key_name):
    # Using a context manager ensures the file handle is closed even on error
    with open(file_path, 'rb') as file_data:
        s3obj.Bucket(bucket_name).put_object(Key=key_name, Body=file_data)
    print(f"✅ File '{key_name}' uploaded successfully to bucket '{bucket_name}'\n")

def list_file(s3obj, bucket_name):
    print(f"📄 The Files in bucket '{bucket_name}' are as below:")
    for obj in s3obj.Bucket(bucket_name).objects.all():
        print(f" - {obj.key}")
    print()  # Empty line for spacing

def del_file(s3obj, bucket_name, key_name):
    s3obj.Object(bucket_name, key_name).delete()
    print(f"🗑️ File '{key_name}' deleted successfully from bucket '{bucket_name}'\n")

def del_bucket(s3obj, bucket_name):
    # The bucket must already be empty, or S3 will reject the delete
    s3obj.Bucket(bucket_name).delete()
    print(f"🧹 Bucket '{bucket_name}' deleted successfully\n")
```
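The us-east-1 special case in `create_bucket` is a common stumbling block: S3 rejects a `CreateBucketConfiguration` that names the default region. One way to keep that rule in a single, easily testable place is a small argument-builder helper (a hypothetical addition, not part of the original script):

```python
def bucket_create_kwargs(bucket_name, region_name):
    """Build the keyword arguments for create_bucket, handling the
    us-east-1 quirk: the default region must be omitted, while any
    other region goes into CreateBucketConfiguration."""
    kwargs = {"Bucket": bucket_name}
    if region_name != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region_name}
    return kwargs

# Usage with the resource object from the script:
# s3obj.create_bucket(**bucket_create_kwargs('my-bucket', 'us-east-2'))
```

Because the helper is pure, you can unit-test the region logic without touching AWS at all.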
aws_caller.py

This file acts as the runner:
```python
import boto3
from aws_wrapper import create_bucket, show_s3_buck, upload_file, list_file, del_file, del_bucket

s3obj = boto3.resource('s3')
file_path = 'testupl.txt'

create_bucket(s3obj, 'your-unique-bucket-name', 'us-east-2')
show_s3_buck(s3obj)
upload_file(s3obj, 'your-unique-bucket-name', file_path, 'testupl.txt')
list_file(s3obj, 'your-unique-bucket-name')
del_file(s3obj, 'your-unique-bucket-name', 'testupl.txt')
del_bucket(s3obj, 'your-unique-bucket-name')
```
NOTE: Make sure your bucket name is globally unique and uses only lowercase letters, numbers, and hyphens (`-`).
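An invalid name only fails once the request reaches AWS, so a quick local check can catch mistakes earlier. Here's a simplified validator sketch (`is_valid_bucket_name` is my own helper; it ignores the rarer dot-in-name rules, since dots are legal but discouraged in bucket names):

```python
import re

def is_valid_bucket_name(name):
    """Simplified S3 bucket-name check: 3-63 characters, lowercase
    letters, digits, and hyphens only, starting and ending with a
    letter or digit. Rejects dots even though S3 technically allows
    them, because they complicate virtual-hosted-style HTTPS access."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9-]{1,61}[a-z0-9]", name))
```

You could call this in `aws_caller.py` before `create_bucket` to fail fast with a clear message.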
🖥️ Sample Output

```
✅ Bucket 'py-lab-from-script' created successfully in 'us-east-2'

📦 The Buckets in S3 are as below:
 - mehul-backup-data
 - py-lab-from-script

✅ File 'testupl.txt' uploaded successfully to bucket 'py-lab-from-script'

📄 The Files in bucket 'py-lab-from-script' are as below:
 - testupl.txt

🗑️ File 'testupl.txt' deleted successfully from bucket 'py-lab-from-script'

🧹 Bucket 'py-lab-from-script' deleted successfully
```
🤔 Why This Is Useful

- You can automate backup scripts or deployment pipelines.
- It's a reusable wrapper; use it across multiple projects.
- It's great for learning how `boto3` interacts with AWS S3.
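As an example of the backup-script idea, a small helper could walk a local folder and map each file to an S3 key, then feed the pairs to `upload_file`. A sketch of the path-to-key step (`collect_backup_keys` is a hypothetical helper, not part of the script above):

```python
from pathlib import Path

def collect_backup_keys(folder):
    """Map every file under `folder` to an S3 key that preserves the
    relative directory structure, using '/' separators as S3 expects."""
    base = Path(folder)
    return {p.relative_to(base).as_posix(): p
            for p in sorted(base.rglob("*")) if p.is_file()}

# Each pair could then be uploaded with the wrapper:
# for key, path in collect_backup_keys("my-data").items():
#     upload_file(s3obj, 'your-unique-bucket-name', str(path), key)
```

Keeping the key-mapping separate from the upload call makes the mapping easy to test locally, with no AWS account involved.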