Automating S3 Bucket Cleanup with Versioning


Managing S3 buckets with versioning enabled can become a daunting task, especially when dealing with a large number of objects and versions. In this blog post, we'll walk through the process of creating an S3 bucket with versioning, uploading some objects, and using a script to automate the deletion of all objects, versions, and delete markers in the bucket. This script is particularly useful for managing and cleaning up large numbers of objects efficiently.

Step 1: Create an S3 Bucket with Versioning

First, let's create an S3 bucket with versioning enabled using the AWS CLI.


#!/bin/bash

BUCKET_NAME="your-unique-bucket-name"

# Create the S3 bucket (for regions other than us-east-1, also pass
# --create-bucket-configuration LocationConstraint=<region>)
aws s3api create-bucket --bucket $BUCKET_NAME --region us-east-1

# Enable versioning on the bucket
aws s3api put-bucket-versioning --bucket $BUCKET_NAME --versioning-configuration Status=Enabled

This script creates an S3 bucket and enables versioning, allowing us to keep track of different versions of the objects we upload.
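
To confirm that versioning is active before uploading anything, you can query the bucket's versioning state:

# Check the bucket's versioning status (should return Status: Enabled)
aws s3api get-bucket-versioning --bucket $BUCKET_NAME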

Step 2: Upload Objects to the Bucket

Next, let's upload some objects to our newly created bucket. You can also upload objects through the AWS Console instead of the CLI.


#!/bin/bash

BUCKET_NAME="your-unique-bucket-name"

# Upload some objects
aws s3 cp file1.txt s3://$BUCKET_NAME/file1.txt
aws s3 cp file2.txt s3://$BUCKET_NAME/file2.txt
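
If you modify file1.txt locally and upload it again, S3 keeps both copies as separate versions of the same key; a quick way to confirm this:

# Re-upload the (modified) file to create a second version of the same key
aws s3 cp file1.txt s3://$BUCKET_NAME/file1.txt

# List all versions of file1.txt to verify that both are retained
aws s3api list-object-versions --bucket $BUCKET_NAME --prefix file1.txt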

Step 3: Delete Objects, Versions, and Delete Markers

The following script automates the deletion of all object versions and delete markers in the bucket, and then deletes the now-empty bucket itself.


#!/bin/bash

BUCKET_NAME="your-unique-bucket-name"

# List all versions and delete markers
aws s3api list-object-versions --bucket $BUCKET_NAME --query "Versions[].[Key, VersionId]" --output text > versions.txt
aws s3api list-object-versions --bucket $BUCKET_NAME --query "DeleteMarkers[].[Key, VersionId]" --output text > delete-markers.txt

# Delete all versions
while IFS=$'\t' read -r key version; do
  if [ -n "$key" ] && [ -n "$version" ]; then
    aws s3api delete-object --bucket "$BUCKET_NAME" --key "$key" --version-id "$version"
  fi
done < versions.txt

# Delete all delete markers
while IFS=$'\t' read -r key version; do
  if [ -n "$key" ] && [ -n "$version" ]; then
    aws s3api delete-object --bucket "$BUCKET_NAME" --key "$key" --version-id "$version"
  fi
done < delete-markers.txt

# Clean up
rm versions.txt delete-markers.txt

# Delete the bucket
aws s3api delete-bucket --bucket "$BUCKET_NAME"
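
Before running the cleanup, it can be useful to preview how much the script is about to remove. A rough check (note that an empty result prints "None", which counts as one line):

# Count object versions and delete markers without deleting anything
aws s3api list-object-versions --bucket $BUCKET_NAME --query "Versions[].[Key, VersionId]" --output text | wc -l
aws s3api list-object-versions --bucket $BUCKET_NAME --query "DeleteMarkers[].[Key, VersionId]" --output text | wc -l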

Explanation of the Script

Let's break down what each part of the script does:

  1. Set the Bucket Name


      BUCKET_NAME="your-unique-bucket-name"
    

    This line defines a variable BUCKET_NAME with the name of the S3 bucket you want to clean up.

  2. List All Versions and Delete Markers


      aws s3api list-object-versions --bucket $BUCKET_NAME --query "Versions[].[Key, VersionId]" --output text > versions.txt
      aws s3api list-object-versions --bucket $BUCKET_NAME --query "DeleteMarkers[].[Key, VersionId]" --output text > delete-markers.txt
    

    These commands list all the versions and delete markers in the bucket and save the output to two text files: versions.txt and delete-markers.txt. The --query parameter is used to extract the keys and version IDs, and the --output text option formats the output as plain text.

  3. Delete All Versions


      while IFS=$'\t' read -r key version; do
        if [ -n "$key" ] && [ -n "$version" ]; then
          aws s3api delete-object --bucket "$BUCKET_NAME" --key "$key" --version-id "$version"
        fi
      done < versions.txt
    

    This loop reads each line from versions.txt, extracting the key and version ID. If both values are present, it uses the aws s3api delete-object command to delete that specific version of the object (a batched alternative for large buckets is sketched after this list).

  4. Delete All Delete Markers


      while IFS=$'\t' read -r key version; do
        if [ -n "$key" ] && [ -n "$version" ]; then
          aws s3api delete-object --bucket "$BUCKET_NAME" --key "$key" --version-id "$version"
        fi
      done < delete-markers.txt
    

    Similar to the previous loop, this loop reads from delete-markers.txt and deletes each delete marker found.

  5. Clean Up


      rm versions.txt delete-markers.txt
    

    This command removes the temporary files versions.txt and delete-markers.txt to clean up after the script runs.

  6. Delete the Bucket


      aws s3api delete-bucket --bucket "$BUCKET_NAME"
    

    Finally, this command deletes the now-empty S3 bucket.
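
The loops above issue one delete-object call per version, which is simple but can be slow for buckets holding many versions. As an alternative sketch (not part of the script above), aws s3api delete-objects accepts up to 1,000 keys per request; the JMESPath query below builds the request payload directly and only covers the first 1,000 versions, so a real run would repeat it until the bucket is empty:

#!/bin/bash

BUCKET_NAME="your-unique-bucket-name"

# Build a batch delete request for up to 1,000 object versions
# (assumes the bucket currently holds at least one version)
aws s3api list-object-versions --bucket "$BUCKET_NAME" \
  --query '{Objects: Versions[0:1000].{Key: Key, VersionId: VersionId}, Quiet: `true`}' \
  --output json > delete-batch.json

# Delete the whole batch in a single API call
aws s3api delete-objects --bucket "$BUCKET_NAME" --delete file://delete-batch.json

rm delete-batch.json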

Purpose and Use Case

Managing S3 buckets with versioning enabled can lead to a large number of object versions and delete markers. This script provides a simple and effective way to clean up these objects, making it ideal for scenarios where you need to manage storage costs or reorganize your bucket structure. By automating the deletion process, you can ensure that your bucket remains organized and free from unnecessary versions and delete markers.
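
If the goal is ongoing cost control rather than a one-off cleanup, a lifecycle rule can expire noncurrent versions automatically instead of running a script by hand. A minimal sketch, assuming a 30-day retention window for old versions is acceptable:

# Expire noncurrent versions after 30 days and remove orphaned delete markers
aws s3api put-bucket-lifecycle-configuration --bucket "$BUCKET_NAME" \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "expire-noncurrent-versions",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},
        "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
        "Expiration": {"ExpiredObjectDeleteMarker": true}
      }
    ]
  }'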

Conclusion

In this blog post, we covered how to create an S3 bucket with versioning, upload objects, and use a script to delete all objects, versions, and delete markers. This script is particularly useful for managing large numbers of objects and maintaining an organized bucket structure.

Feel free to customize and adapt the script to suit your specific needs, and happy cleaning!

