Serverless in AWS: Lambda Functions That Generate and Upload Files to S3


AWS Lambda is a serverless, event-driven compute service that removes the need for developers to manage any infrastructure, while S3 is a general-purpose object storage service. In other words, we can use a simple script (a Lambda function) to generate data and store it in S3 (roughly like a Dropbox or Google Drive). The benefit is clear: the developer gets to focus on code, without worrying about scalability, server management, or reliability.
To demonstrate this, we’ll use the example of generating a simple .txt file, but the data we generate in Lambda, or store in S3, could be anything: CSV, PNG, etc. Additionally, since Lambda’s /tmp storage is limited to 512 MB by default and is not guaranteed to persist between invocations, S3 is the natural place to keep data for as long as desired, and to store much larger files when necessary.
Part 1: Create an S3 bucket and attach a policy
Go to your AWS Management Console, and follow these steps:
1 — Choose your desired AWS region, typically the region closest to you, in the top right corner (e.g. us-west-2), and use this same region in all the following parts of this post; this avoids cross-region latency or access issues.
2 — Then go to S3 (search for it) → create bucket
General purpose is fine
Name must be unique globally
ACLs disabled
Block all public access enabled
Versioning: Optional, but useful if you want to keep multiple versions of the same data
Encryption: Optional, but enable SSE-S3 or SSE-KMS for added security
Tags/logging: Add if you’re tracking cost/auditing
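For reference, the same bucket setup can be sketched with the AWS CLI (a sketch, assuming the CLI is installed and configured; the bucket name and region below are placeholders — adjust to your own):

```shell
# Create the bucket in your chosen region (name must be globally unique)
aws s3api create-bucket \
  --bucket your-bucket-name \
  --region us-west-2 \
  --create-bucket-configuration LocationConstraint=us-west-2

# Block all public access (matches the console setting above)
aws s3api put-public-access-block \
  --bucket your-bucket-name \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Optional: enable versioning
aws s3api put-bucket-versioning \
  --bucket your-bucket-name \
  --versioning-configuration Status=Enabled
```

Note that `--create-bucket-configuration` is required for any region other than us-east-1.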
3 — Attach a policy to your Lambda function’s execution role to allow uploads (s3:PutObject) from Lambda to S3. (If you haven’t created your function yet, the role won’t exist; complete Part 2 first, then return here.)
Go to IAM (search for it) → Roles → click on your Lambda’s execution role (eg your-lambda-name-etc...)
Permissions tab → Add permissions → Create inline policy
Go to the JSON tab, and paste the following (replace your-bucket-name):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
Click Next, then name it something (e.g. PutFileToS3Policy) and click Create policy.
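Equivalently, the inline policy can be attached from the AWS CLI (a sketch; the role name, policy name, and bucket name are placeholders to adjust):

```shell
aws iam put-role-policy \
  --role-name your-lambda-execution-role \
  --policy-name PutFileToS3Policy \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": "arn:aws:s3:::your-bucket-name/*"
      }
    ]
  }'
```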
Notes:
You can also include the s3:GetObject permission in your policy if you later want to allow the Lambda to download files from S3. Simply add the permission under the Action key like so:

{
  // ...
  "Action": ["s3:PutObject", "s3:GetObject"],
  // ...
}
Always grant only the minimum necessary permissions to the Lambda function’s execution role to ensure security.
After that, your Lambda will be authorized to upload files to s3://your-bucket-name/.
Part 2: Create the Lambda function
In the same AWS region used in the prior steps, create the Lambda function as follows:
1 — Go to Lambda in management console (search for it) -> create function
2 — Choose the following:
Author From Scratch
Name it what you want, e.g. firstLambda
Architecture: x86_64 is fine (arm64 is cheaper, but may not be as widely compatible with your Lambda’s dependencies)
Runtime: Node.js 22.x (the latest version, or whatever runtime you need; we’re using a JavaScript/Node.js setup in this post)
Click Create function
3 — Under the Code tab on the following page, replace the boilerplate code in the code source editor with the following. The code creates a .txt file with “Hello, world” as its content and uploads it to the S3 bucket created in the previous steps. Don’t test/run it yet until Part 3 below is complete:
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3Client = new S3Client({ region: 'us-west-2' }); // Adjust region as needed
const BUCKET_NAME = 'your-actual-bucket-name'; // Adjust name of bucket

export const handler = async (event) => {
  try {
    await uploadToS3('Hello, world', BUCKET_NAME, 'test.txt');
    return {
      statusCode: 200,
      body: JSON.stringify('File uploaded successfully'),
    };
  } catch (error) {
    console.error(error);
    return {
      statusCode: 500,
      body: JSON.stringify(error.message),
    };
  }
};

async function uploadToS3(data, bucketName, fileName) {
  const command = new PutObjectCommand({
    Bucket: bucketName,
    Key: fileName,
    Body: data,
    ContentType: 'text/plain',
  });
  try {
    const result = await s3Client.send(command);
    const s3Url = `https://${bucketName}.s3.amazonaws.com/${fileName}`;
    console.log('Uploaded successfully:', s3Url);
    return s3Url;
  } catch (error) {
    console.error('Error uploading to S3:', error);
    throw error;
  }
}
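Note that repeated invocations overwrite test.txt, since the S3 key is fixed. If you want each run to produce a new object, one option is a small key-building helper like the following (a sketch; makeKey is not part of the AWS SDK, just a plain function you could pass as the fileName argument):

```javascript
// Build a unique, sortable S3 key from a prefix and the current timestamp.
// Colons and dots are replaced so the key stays URL-friendly.
function makeKey(prefix, extension) {
  const stamp = new Date().toISOString().replace(/[:.]/g, '-');
  return `${prefix}/${stamp}.${extension}`;
}

// Example usage inside the handler:
//   await uploadToS3('Hello, world', BUCKET_NAME, makeKey('test-files', 'txt'));
console.log(makeKey('test-files', 'txt'));
```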
Part 3: Install packages
Finally, to run this code and connect to S3 from the code source editor, there are some additional steps, since third-party packages do not come pre-installed in the editor environment. (Recent Node.js Lambda runtimes, 18.x and later, do bundle the AWS SDK for JavaScript v3, so the import above may work out of the box; packaging your own copy in a layer still lets you pin the SDK version and add other packages.) An easy solution is Lambda layers, which let us keep running and testing code directly in the code source editor:
1 — On your local machine, create a folder structure like the following. This simply generates a zip file containing the SDK we need to connect to S3 (@aws-sdk/client-s3). We’ll upload this .zip file as a Lambda layer in the next step:
mkdir nodejs
cd nodejs
npm init -y
npm install @aws-sdk/client-s3
cd ..
zip -r aws-sdk-layer.zip nodejs
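The top-level nodejs/ folder matters: Lambda expects Node.js layers to unzip to a nodejs/node_modules/... path, which the commands above produce. The zip contents should look roughly like this:

```
nodejs/
├── package.json
├── package-lock.json
└── node_modules/
    └── @aws-sdk/
        └── client-s3/
            └── ...
```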
2 — In AWS Console: Go to Lambda -> Layers. Click Create layer.
Name: aws-sdk-v3-layer
Click Choose file and upload your aws-sdk-layer.zip
Compatible architecture: x86_64 (or whatever architecture your lambda is using)
Compatible runtimes: Node.js 22.x (latest version, or whatever runtime you need)
Click Create
3 — Attach the layer to your Lambda:
Go to your Lambda function.
Scroll down to Layers section -> Add a layer.
Choose Custom layers -> select the one you just created.
Choose the version (if necessary)
Click Add.
You should now be able to successfully execute your Lambda from the code source editor. Hit the Test button; if you need to create a new test event, do so (name it anything, keep the default settings, and press Save), then hit Test again. A simple .txt file should now appear in your S3 bucket.
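You can also verify the result outside the console with the AWS CLI (a sketch; the function and bucket names are placeholders, and this assumes the CLI is configured for the same region):

```shell
# Invoke the function and inspect its response
aws lambda invoke --function-name firstLambda response.json
cat response.json

# Confirm the object landed in the bucket
aws s3 ls s3://your-bucket-name/
aws s3 cp s3://your-bucket-name/test.txt ./test.txt && cat test.txt
```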
Congrats!
Written by Xavier Reed