Managing Objects on S3 with Client-Side Encryption and Automation

Abishek Gautam

Scenario:

Suppose you have a requirement: your client places a document in a certain directory and wants that file encrypted with client-side encryption and uploaded to S3 every midnight. The file should be uploaded only if it has been modified; otherwise, the encryption and upload should be skipped.

Architecture Diagram

To complete this project we will use the AWS CLI and a Bash script for the automation, with a cron job triggering the script every midnight.
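Before starting, make sure the required tools are available on the machine that will run the job. A quick sanity check (assuming a Debian/Ubuntu-style system with systemd; adjust for your distribution) could look like this:

#confirm the AWS CLI is installed and the credentials work
aws --version
aws sts get-caller-identity

#confirm GPG is available
gpg --version

#confirm the cron daemon is running
systemctl status cron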

Procedure:

Step 1: Create a bucket and enable versioning

aws s3 mb s3://max-demo-abi7

#enable the versioning of bucket
aws s3api put-bucket-versioning --bucket max-demo-abi7 --versioning-configuration Status=Enabled
  • Go to the AWS Management Console, open the bucket under S3, and confirm that versioning is enabled from the Properties tab.
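You can also confirm the setting from the CLI. A minimal check, assuming the same bucket name as above:

#verify that versioning is enabled on the bucket
aws s3api get-bucket-versioning --bucket max-demo-abi7

If versioning is on, the output should contain "Status": "Enabled".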

Step 2: Create a file script.sh

This script does the following:

  1. Watches a file (file1.txt) for changes.

  2. If the file has been modified: encrypts it with GPG.

  3. Uploads the encrypted file to an S3 bucket.

  4. Updates a timestamp log.

  5. If the file has not been modified: skips processing.

  6. If the file doesn't exist: logs that information.

Now, update the variables in the script and set your own passphrase.

#!/bin/bash

# Set DIRECTORY_PATH, FILE_NAME and S3_BUCKET variables
DIRECTORY_PATH="/home/abi/Desktop/max-demo/"
cd "$DIRECTORY_PATH" || { echo "Failed to change directory to $DIRECTORY_PATH"; exit 1; }

FILE_NAME="file1.txt"
ENCRYPTED_FILE="${FILE_NAME}.gpg"
S3_BUCKET="s3://max-demo-abi7"
LAST_MODIFIED_FILE="$DIRECTORY_PATH/last_modified.txt"

# Check if the file exists
if [ -f "$FILE_NAME" ]; then
    # Get the current last modified time of the file
    CURRENT_MODIFIED_TIME=$(stat -c %Y "$FILE_NAME")

    # Check last_modified.txt for previous timestamp (or empty)
    if [ -f "$LAST_MODIFIED_FILE" ]; then
        LAST_MODIFIED_TIME=$(cat "$LAST_MODIFIED_FILE")
        # If the file is empty, treat as not previously uploaded
        if [ -z "$LAST_MODIFIED_TIME" ]; then
            LAST_MODIFIED_TIME=0
        fi
    else
        LAST_MODIFIED_TIME=0  # If the file doesn't exist
    fi

    # Compare timestamps to decide if the file was modified
    if [ "$CURRENT_MODIFIED_TIME" -gt "$LAST_MODIFIED_TIME" ]; then
        echo "$(date): File $FILE_NAME has been modified or is new. Encrypting..."

        # Encrypt the file using GPG (symmetric encryption)
        gpg --batch --yes --symmetric --cipher-algo AES256 --passphrase "max@Passw0rd" -o "$ENCRYPTED_FILE" "$FILE_NAME"

        if [ $? -eq 0 ]; then
            echo "$(date): Encryption successful. Uploading to S3..."

            # Upload the encrypted file to S3
            aws s3 cp "$ENCRYPTED_FILE" "$S3_BUCKET/"

            if [ $? -eq 0 ]; then
                echo "$(date): File successfully uploaded to $S3_BUCKET"
                rm -f "$ENCRYPTED_FILE"

                # Update the last modified timestamp
                echo "$CURRENT_MODIFIED_TIME" > "$LAST_MODIFIED_FILE"
            else
                echo "$(date): Upload to S3 failed!"
            fi
        else
            echo "$(date): Encryption failed!"
        fi
    else
        echo "$(date): File $FILE_NAME has not been modified. Skipping encryption and upload."
    fi
else
    echo "$(date): File $FILE_NAME not found in $DIRECTORY_PATH"
fi
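Hard-coding the passphrase in the script is fine for a quick demo, but you may prefer to keep it out of the script. One hedged variant, assuming a hypothetical passphrase.txt file (not part of the original setup) that contains only the passphrase and is readable only by the script's user, is GPG's --passphrase-file option:

#hypothetical variant: read the passphrase from a protected file instead of hard-coding it
gpg --batch --yes --symmetric --cipher-algo AES256 --passphrase-file "${DIRECTORY_PATH}passphrase.txt" -o "$ENCRYPTED_FILE" "$FILE_NAME"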

Step 3: Make script.sh executable

sudo chmod +x script.sh
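Before scheduling anything, it is worth running the script once by hand to confirm that the encryption and upload work end to end. A quick manual test, assuming the bucket and paths used above:

#run the script manually
./script.sh

#confirm the encrypted object landed in the bucket
aws s3 ls s3://max-demo-abi7/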

Step 4: Set the cron job to run every midnight (12:00 AM)

#checking the status of cron
systemctl status cron

crontab -e
#paste the line below and change the paths to your script.sh file and log file
00 00 * * * <Path/to/script>/script.sh >> <Path/to/log>/logfile.log 2>&1

For testing, you can set the cron job to run a few minutes after the current time.

Example: this entry runs the job at 1:40 PM every day.

40 13 * * * ~/Desktop/max-demo/script.sh >> ~/Desktop/max-demo/logfile.log 2>&1
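Once the cron job fires, you can watch the script's output in the log file. Assuming the example paths above:

#follow the log produced by the cron job
tail -f ~/Desktop/max-demo/logfile.log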

  • After the script runs successfully, the encrypted object with the .gpg extension will appear in the bucket.
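Because versioning is enabled on the bucket, every upload of file1.txt.gpg creates a new object version rather than overwriting the old one. A quick way to see this from the CLI, assuming the same bucket and key:

#list the stored versions of the encrypted object
aws s3api list-object-versions --bucket max-demo-abi7 --prefix file1.txt.gpg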

Step 5: Reading the file

  • Download the file and decrypt it using the passphrase you set earlier.
#download the file from s3 to local directory
aws s3 cp s3://max-demo-abi7/file1.txt.gpg .

#view the encrypted file (the output will be unreadable ciphertext)
cat file1.txt.gpg

#decrypt the file using the passphrase and write the output to file1.txt
gpg --batch --yes --passphrase "max@Passw0rd" -o file1.txt -d file1.txt.gpg

#view the file after decryption
cat file1.txt
  • Viewing the encrypted file with the cat command shows unreadable ciphertext; after decryption, cat file1.txt shows the original content.

  • You can also check logfile.log and last_modified.txt by letting the cron job run again without changing the content of file1.txt; the log should show that the encryption and upload were skipped.
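A quick way to inspect both files after such a run, assuming the example directory and log path used above:

#the stored timestamp from the last successful upload
cat ~/Desktop/max-demo/last_modified.txt

#the most recent log entries (should mention that encryption and upload were skipped)
tail ~/Desktop/max-demo/logfile.log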

Congratulations!!!

You have successfully completed this demonstration. You can now use client-side encryption for the objects you upload to your S3 bucket.


Written by

Abishek Gautam

I'm an AWS Certified Solutions Architect. I write about AWS, Linux, security, and automation.