Consuming DynamoDB Streams with .NET Lambda


DynamoDB Streams provide a powerful way to capture changes in your DynamoDB tables and react to them in real-time. In this post, we'll walk through how to consume these streams using a .NET AWS Lambda function.
Prerequisites
Before you begin, make sure you have the following:
.NET SDK (>= 6.0) installed: https://dotnet.microsoft.com/download
AWS CLI installed and configured with credentials and a default region
Amazon.Lambda.Templates installed:
dotnet new install Amazon.Lambda.Templates
An existing DynamoDB table or permissions to create one
Basic familiarity with AWS Lambda and DynamoDB
(Optional) AWS Toolkit for Visual Studio if you prefer a GUI deployment experience
What Are DynamoDB Streams?
DynamoDB Streams capture table activity (inserts, updates, and deletes) and store the change records in a stream. You can attach an AWS Lambda function to the stream so it automatically gets invoked when changes occur.
Use cases include:
Real-time analytics
Auditing and logging
Replicating data to other systems
Enabling Streams on a Table
You can enable streams using the AWS Console, CLI, or CloudFormation. For example, using AWS CLI:
aws dynamodb update-table \
--table-name MyTable \
--stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES
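If you manage infrastructure as code, a CloudFormation sketch of the same thing looks like this (the table name, key schema, and billing mode here are placeholder assumptions; note that StreamSpecification in CloudFormation takes only StreamViewType):

```yaml
MyTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: MyTable
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: Id
        AttributeType: S
    KeySchema:
      - AttributeName: Id
        KeyType: HASH
    StreamSpecification:
      StreamViewType: NEW_AND_OLD_IMAGES
```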
The StreamViewType determines what data is captured:
KEYS_ONLY - only the key attributes of the modified item
NEW_IMAGE - the item as it appears after the change
OLD_IMAGE - the item as it appeared before the change
NEW_AND_OLD_IMAGES - both the before and after images (recommended for most cases)
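To make that concrete: with NEW_AND_OLD_IMAGES, an update to a hypothetical item keyed by Id = "42" produces a stream record shaped roughly like this (abbreviated; the Total attribute is an assumption for illustration):

```json
{
  "eventID": "...",
  "eventName": "MODIFY",
  "dynamodb": {
    "Keys":     { "Id": { "S": "42" } },
    "OldImage": { "Id": { "S": "42" }, "Total": { "N": "9.99" } },
    "NewImage": { "Id": { "S": "42" }, "Total": { "N": "19.99" } },
    "StreamViewType": "NEW_AND_OLD_IMAGES"
  }
}
```

With KEYS_ONLY you would get only the "Keys" map; with NEW_IMAGE or OLD_IMAGE, only the corresponding image.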
Creating a .NET Lambda to Process Stream Records
Let's build a .NET Lambda that listens to a DynamoDB Stream and processes the records.
1. Create a New Lambda Project
Use the AWS Lambda template:
dotnet new lambda.DynamoDBEventFunction -n DynamoDbStreamConsumer
cd DynamoDbStreamConsumer
2. Update the Function Handler
In Function.cs, you'll find a method like this:
// Requires: using Amazon.Lambda.DynamoDBEvents; and, for Document,
// using Amazon.DynamoDBv2.DocumentModel;
public async Task FunctionHandler(DynamoDBEvent dynamoEvent, ILambdaContext context)
{
    foreach (var record in dynamoEvent.Records)
    {
        context.Logger.LogInformation($"Event ID: {record.EventID}");
        context.Logger.LogInformation($"Event Name: {record.EventName}");

        // NewImage is present for INSERT and MODIFY events
        if (record.Dynamodb.NewImage != null)
        {
            var item = Document.FromAttributeMap(record.Dynamodb.NewImage);
            context.Logger.LogInformation($"New item: {item.ToJsonPretty()}");
        }

        // OldImage is present for MODIFY and REMOVE events
        if (record.Dynamodb.OldImage != null)
        {
            var oldItem = Document.FromAttributeMap(record.Dynamodb.OldImage);
            context.Logger.LogInformation($"Old item: {oldItem.ToJsonPretty()}");
        }
    }
}
You can deserialize NewImage and OldImage to your model if needed.
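As a sketch of that, assuming a hypothetical Order model whose table has Id (string) and Total (number) attributes, you can map the image's attribute values manually instead of going through Document:

```csharp
using System.Collections.Generic;
using System.Globalization;
using Amazon.DynamoDBv2.Model;

// Hypothetical model; adjust the properties to your table's attributes.
public class Order
{
    public string Id { get; set; }
    public decimal Total { get; set; }
}

public static class OrderMapper
{
    // Maps a stream image (attribute name -> AttributeValue) to the model.
    // .S holds string values, .N holds numbers serialized as strings.
    public static Order ToOrder(Dictionary<string, AttributeValue> image) =>
        new Order
        {
            Id = image["Id"].S,
            Total = decimal.Parse(image["Total"].N, CultureInfo.InvariantCulture)
        };
}
```

You would call OrderMapper.ToOrder(record.Dynamodb.NewImage) inside the handler's loop.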
3. Deploy the Lambda Function
You can deploy the function using the AWS Toolkit for Visual Studio or with the CLI:
dotnet lambda deploy-function DynamoDbStreamConsumer
4. Attach the Stream to Lambda
After deployment, link the DynamoDB stream to the Lambda:
aws lambda create-event-source-mapping \
--function-name DynamoDbStreamConsumer \
--event-source-arn arn:aws:dynamodb:region:account-id:table/MyTable/stream/timestamp \
--starting-position LATEST \
--batch-size 10
This sets up the Lambda to process stream records in batches of 10.
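If you don't have the stream ARN handy, you can look it up from the table (this assumes the AWS CLI is configured and the table already has streams enabled):

```shell
aws dynamodb describe-table \
--table-name MyTable \
--query "Table.LatestStreamArn" \
--output text
```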
Error Handling and Retries
If your Lambda throws an exception, the entire batch is retried until it succeeds or the records expire from the stream (after roughly 24 hours). To avoid poison pills:
Configure an on-failure destination (an SQS queue or SNS topic) for discarded batches
Enable partial batch responses so one bad record doesn't block the whole batch
Limit retries with MaximumRetryAttempts and use BisectBatchOnFunctionError to isolate failing records
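As a sketch of the partial-batch-response approach (this assumes the event source mapping was created with --function-response-types ReportBatchItemFailures, and uses the StreamsEventResponse type from Amazon.Lambda.DynamoDBEvents):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.Lambda.DynamoDBEvents;

public class Function
{
    public async Task<StreamsEventResponse> FunctionHandler(DynamoDBEvent dynamoEvent, ILambdaContext context)
    {
        var failures = new List<StreamsEventResponse.BatchItemFailure>();

        foreach (var record in dynamoEvent.Records)
        {
            try
            {
                await ProcessRecordAsync(record);
            }
            catch (Exception ex)
            {
                context.Logger.LogError($"Failed to process {record.EventID}: {ex.Message}");
                // Report only the failing record; Lambda retries from this
                // sequence number instead of replaying the whole batch.
                failures.Add(new StreamsEventResponse.BatchItemFailure
                {
                    ItemIdentifier = record.Dynamodb.SequenceNumber
                });
            }
        }

        return new StreamsEventResponse { BatchItemFailures = failures };
    }

    private Task ProcessRecordAsync(DynamoDBEvent.DynamodbStreamRecord record)
    {
        // Your processing logic goes here.
        return Task.CompletedTask;
    }
}
```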
Logging and Monitoring
Use Amazon CloudWatch Logs to debug and monitor stream processing. You can also add structured logging with Serilog or Microsoft.Extensions.Logging.
Conclusion
Using DynamoDB Streams with .NET Lambda functions allows you to build reactive, event-driven applications with minimal overhead. Whether you're tracking changes, replicating data, or triggering downstream processes, this integration is a powerful tool in your AWS toolkit.
Written by Renato Ramos Nascimento
With over 14 years in software development, I specialize in backend systems using .NET, Python, and Java. I bring full lifecycle expertise, including requirements analysis, client/server and data layer development, automated testing (unit, integration, end-to-end), and CI/CD implementations using Docker, GitLab Pipelines, GitHub Actions, Terraform, and AWS CodeStar.