File Upload API using AWS Lambda


Serverless File Upload API on AWS Using Lambda, API Gateway, and S3 (Python)
Build and deploy a secure, serverless file upload API using Python, AWS Lambda, API Gateway, and S3 — with a simple frontend and no servers to manage.
🧭 Introduction
For my third DevOps/cloud project, I decided to go fully serverless. In this project, I built a secure file upload API using AWS services — without managing any servers. The architecture uses:
Lambda (Python) to process uploads
API Gateway to expose a public REST endpoint
S3 to store uploaded files
A minimal HTML frontend for file selection and upload
🔗 GitHub Repo: Serverless_File-Upload_API
🎯 Objective
To create a serverless solution where:
Users upload files via browser
Files are sent to a Lambda function via REST API
Lambda stores the file in S3 bucket
Bonus goals:
Use AWS SAM for infrastructure as code
Keep frontend and backend loosely coupled
🧱 Architecture Diagram
[HTML Upload Form]
        |
        v
[API Gateway Endpoint] ---> [Lambda Function] ---> [S3 Bucket]
🖥️ Frontend – index.html
A simple HTML + JS form sends the file to API Gateway.
<input type="file" id="fileInput" />
<button onclick="uploadFile()">Upload</button>
<script>
  async function uploadFile() {
    const file = document.getElementById('fileInput').files[0];
    const response = await fetch("YOUR_API_GATEWAY_ENDPOINT", {
      method: "POST",
      headers: { "X-Filename": file.name },
      body: file
    });
    console.log(await response.json());
  }
</script>
📸 Screenshot: Upload form in browser
🐍 Lambda – upload_handler.py
Lambda reads the incoming file from the request body, then uploads it to the S3 bucket using boto3.
import json, boto3, base64, os

s3 = boto3.client('s3')
BUCKET_NAME = os.environ.get('UPLOAD_BUCKET')

def lambda_handler(event, context):
    # API Gateway flags binary payloads with isBase64Encoded; only decode in that case
    raw_body = event.get('body') or ''
    body = base64.b64decode(raw_body) if event.get('isBase64Encoded') else raw_body.encode()

    # Header names can arrive lower-cased, so check both spellings
    headers = event.get('headers') or {}
    filename = headers.get('X-Filename') or headers.get('x-filename') or 'uploaded_file'

    s3.put_object(Bucket=BUCKET_NAME, Key=filename, Body=body)
    return {
        'statusCode': 200,
        'body': json.dumps({'message': 'Upload successful', 'file': filename})
    }
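Before wiring everything up in AWS, it helps to poke the handler locally. Here is a minimal sketch (the bucket name is a placeholder, it assumes your local AWS credentials can write to a real bucket with that name, and the event dict simply mimics the shape of an API Gateway proxy event):
import base64, os

# Placeholder bucket name; must be set before importing the handler,
# because upload_handler reads UPLOAD_BUCKET at import time
os.environ['UPLOAD_BUCKET'] = 'my-test-upload-bucket'

from upload_handler import lambda_handler

# Mimic an API Gateway proxy event carrying a base64-encoded binary body
event = {
    'body': base64.b64encode(b'hello from a local test').decode(),
    'isBase64Encoded': True,
    'headers': {'X-Filename': 'hello.txt'},
}

print(lambda_handler(event, None))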
📸 Screenshot: CloudWatch logs showing successful upload
📦 Infrastructure – template.yaml
Using AWS SAM to define the Lambda function, S3 bucket, and API Gateway trigger.
Resources:
  UploadFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: upload_handler.lambda_handler
      Runtime: python3.9
      CodeUri: lambda/
      Environment:
        Variables:
          UPLOAD_BUCKET: !Ref UploadBucket
      Events:
        UploadAPI:
          Type: Api
          Properties:
            Path: /upload
            Method: post
  UploadBucket:
    Type: AWS::S3::Bucket
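Two things the minimal template above glosses over: the function needs permission to write to the bucket, and the API needs to treat uploads as binary. A rough sketch of how that can be expressed in SAM, using the S3CrudPolicy policy template and the Globals section (adapt the names to your own template):
Globals:
  Api:
    BinaryMediaTypes:
      - "*~1*"   # "*/*" with the slash escaped, so all content types pass through as binary

Resources:
  UploadFunction:
    Type: AWS::Serverless::Function
    Properties:
      # ...same properties as above, plus:
      Policies:
        - S3CrudPolicy:
            BucketName: !Ref UploadBucket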
📸 Screenshot: API Gateway and Lambda deployed via SAM
🚀 Deploying the App
Prerequisites:
AWS CLI and AWS SAM CLI
IAM user with Lambda + S3 permissions
Steps:
sam build
sam deploy --guided
Copy the API Gateway endpoint output and use it in your index.html frontend.
📸 Screenshot: Successful SAM deployment CLI output
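Once it's deployed, you can also hit the endpoint without the browser. A quick sanity check with Python's requests library (the URL below is a placeholder for the endpoint from the stack output, and test.txt is any local file):
import requests

# Placeholder: use the endpoint printed by `sam deploy`
url = "https://YOUR_API_ID.execute-api.YOUR_REGION.amazonaws.com/Prod/upload"

with open("test.txt", "rb") as f:
    resp = requests.post(url, data=f, headers={"X-Filename": "test.txt"})

print(resp.status_code, resp.json())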
🔐 GitHub Repo
📂 github.com/PradeepMahadevaiah/Serverless_File-Upload_API
🧠 Learnings & Next Steps
✅ Learned AWS SAM deployment, Lambda integration with API Gateway and S3
✅ Practiced environment variables and event decoding in Python Lambda
✅ Worked with API Gateway binary mode + file headers
🔄 Improvements coming:
Add validation for file type and size (a first sketch follows below)
Store metadata in DynamoDB
Add Cognito auth or signed URLs
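For the validation item, a first pass might be a small helper called before put_object (a sketch; the allowed extensions and the 5 MB cap are arbitrary examples, not anything the current code enforces):
import os

ALLOWED_EXTENSIONS = {'.png', '.jpg', '.pdf', '.txt'}  # example whitelist
MAX_SIZE_BYTES = 5 * 1024 * 1024                       # example 5 MB cap

def validate_upload(filename, body):
    """Return an error message if the upload should be rejected, else None."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return f'File type {ext or "(none)"} not allowed'
    if len(body) > MAX_SIZE_BYTES:
        return 'File exceeds the size limit'
    return None
The handler would then return a 400 with that message instead of writing to S3 when validation fails.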
🙌 Conclusion
This serverless file uploader is powerful, scalable, and easy to extend. It showcases how DevOps and cloud skills come together — from infrastructure as code to hands-on coding with Python.
Feel free to check out the GitHub repo or leave comments with your feedback or suggestions!