Guide to Managing Amazon S3 Storage Buckets

Amazon S3: Simple Storage Service
Amazon S3 is an object storage service that offers industry-leading scalability, data availability, security, and performance.
Customers of all sizes and industries can use Amazon S3 to store and protect any amount of data for a range of use cases, such as data lakes, websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics.
Amazon S3 is one of the main building blocks of AWS.
It is advertised as "infinitely scaling" storage.
Many websites use Amazon S3 as a backbone.
Many AWS services also use Amazon S3 for integration.
Amazon S3 use cases:
Backup and storage
Disaster Recovery
Archive
Hybrid cloud storage
Application Hosting
Data lakes and big data analytics
Software delivery
Static Website
Amazon S3 Bucket
Amazon S3 lets you store objects (files) in "buckets".
Buckets must have a globally unique name.
Buckets are defined at the region level.
S3 looks like a global service but buckets are created in a region.
Storage classes:
Amazon S3 offers a range of storage classes designed for different use cases. For example, you can store mission-critical production data in S3 Express One Zone for frequent access, save cost by storing infrequently accessed data in S3 Standard-IA or S3 One Zone-IA, and archive data at the lowest cost in S3 Glacier Instant Retrieval, S3 Glacier Flexible Retrieval, and S3 Glacier Deep Archive.
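If you work with the AWS SDK rather than the console, the storage class can be chosen per object at upload time. The sketch below uses Python (boto3) and assumes configured AWS credentials; the bucket name, key, and file are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Store an object directly in a lower-cost storage class (Standard-IA).
# Bucket name, key, and file are placeholder values for this example.
with open("report-2023.pdf", "rb") as f:
    s3.put_object(
        Bucket="my-example-bucket",
        Key="archive/report-2023.pdf",
        Body=f,
        StorageClass="STANDARD_IA",
    )
```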
How to create an S3 bucket?
Log in to the AWS Console.
Go to S3 under the Services menu.
Click Create bucket.
Select AWS region.
Bucket type: choose General purpose or Directory, as per your requirement.
Enter a unique bucket name (must be globally unique).
Object Ownership: ACLs disabled.
Bucket versioning: Enable versioning if needed.
Default encryption: server-side encryption with Amazon S3 managed keys (SSE-S3).
Bucket Key: enabled.
Click Create bucket.
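The same bucket setup can be scripted. Below is a minimal boto3 sketch assuming a placeholder bucket name and the ap-south-1 region; the encryption call mirrors the SSE-S3 and Bucket Key settings chosen above.

```python
import boto3

region = "ap-south-1"                      # example region
bucket = "my-globally-unique-bucket-name"  # must be globally unique

s3 = boto3.client("s3", region_name=region)

# Create the bucket in the chosen region
# (for us-east-1, omit CreateBucketConfiguration).
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": region},
)

# Default encryption: SSE-S3 with Bucket Key enabled.
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"},
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```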
Uploading files or folders to an S3 bucket
Upload a File
Go to the S3 service.
Click your bucket name.
Click “Upload” > “Add files”.
Select the file from your computer.
Click “Upload”.
Upload a Folder
Click “Upload” > “Add folder”.
Choose a folder from your system.
Click “Upload”.
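For larger or repeated uploads, the same steps can be done with boto3. This is a small sketch with a placeholder bucket name and local folder; it uploads one file, then walks a folder and uploads each file under its relative path as the object key.

```python
import os
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # placeholder bucket name

# Upload a single file.
s3.upload_file("photo.jpg", bucket, "photo.jpg")

# Upload a folder by walking it and uploading each file it contains.
folder = "my-website"  # placeholder local folder
for root, _dirs, files in os.walk(folder):
    for name in files:
        local_path = os.path.join(root, name)
        key = os.path.relpath(local_path, folder).replace(os.sep, "/")
        s3.upload_file(local_path, bucket, key)
```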
How to make an object publicly accessible, and why?
Making an S3 object publicly accessible means anyone on the internet can view or download it.
Why allow public access?
Hosting a static website.
Serving public images, videos, or documents.
Making downloadable files available to users.
How to Make an S3 Object Public
Go to the S3 Console.
Click on your bucket name.
Go to Permissions.
Under Block public access (bucket settings), click Edit.
Uncheck Block all public access.
Save changes.
Under Object Ownership, click Edit.
Choose ACLs enabled and check "I acknowledge that ACLs will be restored".
Save changes.
After that:
Select the folder, then Actions > Make public using ACL > Make public.
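The console steps above can also be expressed with boto3. This is only a sketch (a bucket policy is the other common way to grant public access); the bucket name and object key are placeholders, and BucketOwnerPreferred is one of the Object Ownership values that corresponds to ACLs being enabled.

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # placeholder bucket name

# 1) Remove the bucket-level "Block all public access" settings.
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)

# 2) Enable ACLs by changing Object Ownership away from "Bucket owner enforced".
s3.put_bucket_ownership_controls(
    Bucket=bucket,
    OwnershipControls={"Rules": [{"ObjectOwnership": "BucketOwnerPreferred"}]},
)

# 3) Make a single object publicly readable via its ACL.
s3.put_object_acl(Bucket=bucket, Key="images/logo.png", ACL="public-read")
```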
How to host a static website on Amazon S3
Create a bucket.
Upload your files and folders.
Enable static website hosting
In the bucket, go to Properties
Scroll to “Static website hosting”
Click Edit.
Choose "Enable"
Set the Index document to index.html and the Error document to error.html.
Save changes.
Make bucket publicly accessible
Go to the S3 Console.
Click on your bucket name.
Go to Permissions.
Under Block public access (bucket settings), click Edit.
Uncheck Block all public access.
Save changes.
Under Object Ownership, click Edit.
Choose ACLs enabled and check "I acknowledge that ACLs will be restored".
Save changes.
After that:
Go to Objects, select all objects, then Actions > Make public using ACL.
When you enable static website hosting on an S3 bucket, AWS provides a public HTTP endpoint.
Example:
http://<bucket-name>.s3-website-<region>.amazonaws.com/<path-to-file>
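Enabling website hosting can also be scripted. Here is a minimal boto3 sketch with a placeholder bucket name, matching the index and error documents configured above.

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # placeholder bucket name

# Turn on static website hosting with index and error documents.
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)
```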
Bucket versioning:
Versioning is a means of keeping multiple variants of an object in the same bucket.
How to enable versioning:
Click on your bucket name.
Go to the Properties.
Scroll to Bucket Versioning.
Click “Enable” and save.
Once enabled, versioning cannot be disabled; it can only be suspended.
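As a quick sketch (placeholder bucket name and key), versioning can be enabled with boto3, and the stored versions of an object can then be listed:

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # placeholder bucket name

# Enable versioning; use Status="Suspended" later to pause it,
# since versioning cannot be fully disabled once turned on.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# List every version kept for a given object key.
resp = s3.list_object_versions(Bucket=bucket, Prefix="report.txt")
for v in resp.get("Versions", []):
    print(v["Key"], v["VersionId"], v["IsLatest"])
```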
Managing the lifecycle of objects
S3 Lifecycle helps you store objects cost-effectively throughout their lifecycle by transitioning them to lower-cost storage classes or by deleting expired objects on your behalf. To manage the lifecycle of your objects, create an S3 Lifecycle configuration: a set of rules that define the actions Amazon S3 applies to a group of objects.
There are two types of actions:
Transition actions
Expiration actions
Lifecycle rules in S3 let you automatically manage objects based on age or status.
How to create a lifecycle rule
Go to the S3 Console
Open your bucket
Click on the “Management” tab
Choose “Create lifecycle rule”.
Lifecycle rule configuration:
Lifecycle rule name: give the rule a name.
Choose a rule scope: Apply to all objects in the bucket, and check "I acknowledge that this rule will apply to all objects in the bucket".
Select the storage classes and transition days as per your requirement.
Create rule.
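The equivalent lifecycle rule can be defined in code. This boto3 sketch (placeholder bucket name; the transition days and storage classes are only examples) applies one rule to all objects in the bucket:

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # placeholder bucket name

# One rule for the whole bucket: move objects to Standard-IA after 30 days,
# to Glacier Flexible Retrieval after 90 days, and delete them after 365 days.
s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tiering-and-expiry",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # empty prefix = all objects
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```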
Replicating objects within and across Regions
You can use replication to enable automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts. You can replicate objects to a single destination bucket or to multiple destination buckets. The destination buckets can be in different AWS Regions or in the same Region as the source bucket.
There are two types of replication:
Live replication: automatically replicates new and updated objects.
On-demand replication: replicates objects that already exist.
Setting up replication between Amazon S3 buckets
Create two buckets and enable versioning on both the source and the destination bucket.
Create a replication rule:
Go to Management > Replication rules > Create replication rule.
Replication rule configuration
Click "Create replication rule".
Name your rule and set Status to Enabled.
Choose the rule scope: Apply to all objects in the bucket.
Destination: choose a bucket in this account, or specify a bucket in another account.
Select bucket.
IAM role: Create new role.
Destination storage class: select the storage class and days as per your needs.
Save.
Replicate existing objects: choose "Yes, replicate existing objects", then Submit.
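For reference, a live replication rule like the one above can also be created with boto3. This sketch assumes both buckets already exist with versioning enabled and uses placeholder bucket names, account ID, and IAM role ARN; existing objects still need the Batch Operations job described next.

```python
import boto3

s3 = boto3.client("s3")

# Live replication from a source bucket to a destination bucket.
# The role ARN is a placeholder and must allow S3 to replicate objects.
s3.put_bucket_replication(
    Bucket="my-source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # empty filter = all objects
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::my-destination-bucket",
                    "StorageClass": "STANDARD",
                },
            }
        ],
    },
)
```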
Create Batch Operations job
Job selection: Automatically run the job when it's ready.
Completion report: Generate completion report.
Permissions: Create new role.
Conclusion:
Amazon S3 is a versatile and robust object storage service that caters to a wide range of use cases, from data lakes and big data analytics to hosting static websites and disaster recovery. Its scalability, security, and performance make it a cornerstone of AWS infrastructure, supporting businesses of all sizes. With features like versioning, lifecycle management, and cross-region replication, Amazon S3 provides users with the tools needed to efficiently manage and protect their data. Whether you're looking to store mission-critical data or archive infrequently accessed information, Amazon S3 offers a variety of storage classes to meet your needs, ensuring cost-effectiveness and reliability.