🤖 Automating File Upload Alerts with AWS Lambda, S3, SES and CloudWatch

Aubrey T Dube

🧠 Introduction

Imagine uploading a file to the cloud and instantly getting a log entry, or even an email notification, confirming the upload.
This project shows exactly how to build that workflow using AWS S3, Lambda, CloudWatch, and SES.

🌊 Flow

When a file is uploaded to an Amazon S3 bucket, that S3 event triggers an AWS Lambda function. The Lambda function will:

  • Log the event (bucket, key, time) to CloudWatch Logs (primary behavior), and

  • Optionally read object metadata (head_object) and send an email notification using Amazon SES.
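For reference, here is an abridged sketch of the event payload S3 delivers to the function (illustrative values; real events include extra fields such as eventTime and awsRegion):

{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "csn-lambda-bucket" },
        "object": { "key": "week-10.png", "size": 104857 }
      }
    }
  ]
}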

🧩 Step-by-step Implementation

1. 🗑 Create an S3 bucket

Console

  1. AWS Console → Services → S3 → Create bucket.

  2. Bucket name: csn-lambda-bucket (bucket names MUST be globally unique) → Create bucket, or use the boto3 snippet below.
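If you prefer scripting to console clicks, a minimal boto3 sketch of the same step (assumes the us-east-1 region; other regions also need a CreateBucketConfiguration):

import boto3

s3 = boto3.client('s3')

# Bucket names are global across all AWS accounts, so pick your own unique name.
s3.create_bucket(Bucket='csn-lambda-bucket')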


2. โš™๏ธ Create IAM role for Lambda (with CloudWatch, and optional S3/SES permissions)

We will create a role that trusts Lambda and attach the AWS-managed AWSLambdaBasicExecutionRole policy for CloudWatch logging. If the Lambda must read objects (head_object) or call SES, attach a custom policy with explicit resources.

Trust policy (Lambda)

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}

Console

  1. Console → IAM → Roles → Create role.

  2. Trusted entity: AWS service → Lambda → Next.

  3. Attach AWSLambdaBasicExecutionRole.

  4. (Optional) Create and attach a custom policy for S3 read and/or SES send actions (see policy JSONs below).

  5. Name the role: lambda-s3-exec-role.
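The role can also be created from code; a hedged boto3 sketch using the trust policy shown above:

import json
import boto3

iam = boto3.client('iam')

# Trust policy allowing the Lambda service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='lambda-s3-exec-role',
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# AWS-managed policy that grants CloudWatch logging permissions.
iam.attach_role_policy(
    RoleName='lambda-s3-exec-role',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
)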

Additional IAM policy JSONs to attach to the role (optional)

S3 read (if Lambda will call head_object)


{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ReadForSpecificBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject","s3:ListBucket","s3:GetObjectAcl"],
      "Resource": [
        "arn:aws:s3:::csn-lambda-bucket",
        "arn:aws:s3:::csn-lambda-bucket/*"
      ]
    }
  ]
}

SES send (optional)

{
  "Version":"2012-10-17",
  "Statement":[
    {
      "Effect":"Allow",
      "Action":["ses:SendEmail","ses:SendRawEmail"],
      "Resource":"*"
    }
  ]
}
Note: SES often requires you to verify the sender identity, and SES accounts may be in sandbox mode (the recipient must also be verified until you request production access).
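Identity verification can be scripted too; a small sketch (each call emails a confirmation link that the address owner must click):

import boto3

ses = boto3.client('ses')

# Verify the sender; while the account is in sandbox mode, verify the recipient as well.
ses.verify_email_identity(EmailAddress='verified-sender@example.com')
ses.verify_email_identity(EmailAddress='verified-recipient@example.com')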

3. 💡 Create the Lambda function

Create a function with a Python runtime.

1. Console → Lambda → Create function → Author from scratch.

2. Function name: s3-upload-trigger.

3. Runtime: Python 3.9 (or latest available).

4. Permissions: choose Use an existing role → select lambda-s3-exec-role and create the function.

5. In Configuration: set Timeout 30s, Memory 128MB.

6. Add environment variables (optional):

   - SEND_EMAIL = false
   - SES_SOURCE = verified-sender@example.com
   - SES_DEST = verified-recipient@example.com

7. Paste the Lambda code below into the inline editor and Deploy.

- This function logs the S3 event, optionally reads object metadata, and sends an SES email when the environment variable SEND_EMAIL=true.

import json
import os
import logging
import urllib.parse
import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3_client = boto3.client('s3')
ses_client = boto3.client('ses')  # only used if SEND_EMAIL = true


def lambda_handler(event, context):
    logger.info("Received event: %s", json.dumps(event))

    records = event.get('Records', [])
    for record in records:
        try:
            s3 = record['s3']
            bucket = s3['bucket']['name']
            key = urllib.parse.unquote_plus(s3['object']['key'])
            logger.info("New object in bucket '%s': key='%s'", bucket, key)

            # Optional: read metadata (head_object)
            try:
                head = s3_client.head_object(Bucket=bucket, Key=key)
                logger.info("Object metadata: ContentLength=%s ContentType=%s",
                            head.get('ContentLength'), head.get('ContentType'))
            except Exception:
                logger.exception("Could not read object head (maybe no permission).")

            # Optional SES email
            send_email = os.environ.get('SEND_EMAIL', 'false').lower() == 'true'
            if send_email:
                source = os.environ.get('SES_SOURCE')
                dest = os.environ.get('SES_DEST')
                if not source or not dest:
                    logger.error("SES_SOURCE or SES_DEST environment variables not set.")
                else:
                    subject = f"New upload: {key}"
                    body = f"A new object was uploaded to {bucket}/{key}."
                    try:
                        ses_client.send_email(
                            Source=source,
                            Destination={'ToAddresses': [dest]},
                            Message={
                                'Subject': {'Data': subject},
                                'Body': {'Text': {'Data': body}}
                            }
                        )
                        logger.info("SES notification sent to %s", dest)
                    except Exception:
                        logger.exception("Failed to send SES email.")

        except KeyError:
            logger.exception("Malformed S3 event record; skipping.")

    return {'statusCode': 200, 'body': f'Processed {len(records)} record(s)'}
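To sanity-check the handler before wiring up the trigger, you can invoke it locally with a minimal fake event. A sketch, assuming the code above is saved as handler.py (an assumed filename) and your AWS credentials/region are configured; head_object simply logs an exception if it cannot reach the bucket:

from handler import lambda_handler  # handler.py is an assumed filename

# Minimal fake S3 event with just the fields the handler reads.
fake_event = {
    "Records": [
        {"s3": {"bucket": {"name": "csn-lambda-bucket"},
                "object": {"key": "week-10.png"}}}
    ]
}

lambda_handler(fake_event, None)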


4. 🔫 Add S3 trigger (two ways)

Method A: From the Lambda console (recommended)

1. Open your Lambda function → **Add trigger**.

2. Choose S3.

3. Bucket: csn-lambda-bucket.

4. Event type: All object create events (or PUT only).

5. Click Add. The Lambda console automatically adds a resource-based permission so S3 can invoke the function.

Method B: From S3

1. S3 → select bucket → Properties → Event notifications (or Events tab).

2. **Create event notification**: select the events (PUT or All object create events), set Destination to Lambda function, and pick the function (a scripted version follows below).
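Method B can also be done programmatically; a minimal boto3 sketch (the account ID in the function ARN is a placeholder):

import boto3

lambda_client = boto3.client('lambda')
s3 = boto3.client('s3')

FUNCTION_ARN = 'arn:aws:lambda:us-east-1:123456789012:function:s3-upload-trigger'

# Grant S3 permission to invoke the function (the console does this for you in Method A).
lambda_client.add_permission(
    FunctionName='s3-upload-trigger',
    StatementId='s3-invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::csn-lambda-bucket',
)

# Route all object-create events from the bucket to the function.
# Note: this call overwrites any existing notification configuration on the bucket.
s3.put_bucket_notification_configuration(
    Bucket='csn-lambda-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': FUNCTION_ARN,
            'Events': ['s3:ObjectCreated:*'],
        }]
    },
)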


5. 🧪 Test & verify

1. Upload a file to your S3 bucket, e.g. week-10.png (or use the boto3 snippet after this list).

2. Open CloudWatch Logs → Log groups → `/aws/lambda/s3-upload-trigger` → open the latest log stream.

- Look for messages like: New object in bucket 'csn-lambda-bucket': key='week-10.png'

- If SEND_EMAIL=true, also look for the SES send confirmation near the top of the log and check the recipient's inbox.
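The upload in step 1 can also be done from code (assumes week-10.png exists locally and your credentials are configured):

import boto3

s3 = boto3.client('s3')

# Uploading the object fires the ObjectCreated event and invokes the Lambda.
s3.upload_file('week-10.png', 'csn-lambda-bucket', 'week-10.png')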

🧠 Conclusion

This project showcases the power of AWS's serverless ecosystem. By connecting S3, Lambda, CloudWatch, and SES, we've built a system that reacts instantly to file uploads without any servers to manage.


Thanks for reading 🙌

Feel free to connect or drop feedback on LinkedIn or GitHub.
