API Gateway Payload Limit Solution: Using Pre-Signed S3 URLs

Saurabh Adhau

Introduction

Amazon API Gateway is a powerful service that allows developers to create and manage APIs at scale. However, one major limitation is the 10MB payload size limit for requests. This can be a problem when users need to upload large files.

This article explains how to work around the API Gateway payload limit using pre-signed S3 URLs, enabling efficient and scalable file uploads.

Understanding the Problem

Poor Architecture: Direct Upload via API Gateway

A common but inefficient approach is to send file uploads directly through API Gateway before storing them in an S3 bucket.

How It Works:

  1. The mobile client sends a file upload request to API Gateway.

  2. API Gateway attempts to process and forward the file to an S3 bucket.

  3. If the file is greater than 10MB, the request fails due to API Gateway’s payload limit.

Problems with This Approach:

Payload Limit – Files larger than 10MB are rejected outright, and this quota cannot be raised.
High Costs – Every uploaded byte passes through API Gateway (and any backing Lambda), incurring request and data-transfer charges that a direct S3 upload avoids.
Increased Latency – API Gateway and Lambda act as a middle layer, adding an extra hop that slows the upload down.
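The limit bites even earlier for binary content: API Gateway proxy integrations hand binary bodies to Lambda base64-encoded, which inflates the payload by roughly one third (4 output bytes per 3 input bytes). A quick standard-library check illustrates the inflation:

```python
import base64

# A 10 MB binary payload, before any encoding.
raw = b"\x00" * 10_000_000
encoded = base64.b64encode(raw)

print(len(raw))      # 10000000
print(len(encoded))  # 13333336 -- already past the 10MB limit after base64 encoding
```

So a binary file well under 10MB can still blow past the limit once encoded, which is another reason to keep file bytes out of API Gateway entirely.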


Better Architecture: Using Pre-Signed S3 URLs

A more efficient method is to use pre-signed S3 URLs, which allow clients to upload files directly to S3, bypassing API Gateway’s size limits.

How This Works:

  1. The mobile client sends a request to API Gateway.

  2. API Gateway invokes a Lambda function.

  3. Lambda generates a pre-signed S3 URL.

  4. Lambda returns the pre-signed URL to API Gateway.

  5. API Gateway forwards the URL to the mobile client.

  6. The mobile client uploads the file directly to S3 using the pre-signed URL.

Why This Works Better

Bypasses API Gateway Payload Limit – The file is uploaded directly to S3, avoiding the 10MB restriction.
Reduces Costs – API Gateway and Lambda handle only small requests (to generate the URL), reducing data transfer costs.
Improves Performance – Direct uploads to S3 are faster and more efficient.
Highly Scalable – This approach can handle large file uploads seamlessly.

Implementation Example

Step 1: Create a Lambda Function to Generate a Pre-Signed URL

The following Python code (using Boto3) generates a pre-signed URL for an S3 bucket:

import json
import os

import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket_name = os.environ['BUCKET_NAME']

    # The object key comes from the ?file_name=... query string parameter.
    params = event.get('queryStringParameters') or {}
    file_name = params.get('file_name')
    if not file_name:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': 'Missing required query parameter: file_name'})
        }

    # Signing happens locally with the function's credentials; no S3 call is made here.
    pre_signed_url = s3.generate_presigned_url(
        'put_object',
        Params={'Bucket': bucket_name, 'Key': file_name},
        ExpiresIn=3600  # URL valid for 1 hour
    )

    return {
        'statusCode': 200,
        'body': json.dumps({'upload_url': pre_signed_url})
    }
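For reference, the handler is driven by an API Gateway proxy event and returns a proxy-style response. The sketch below uses a hypothetical event and a placeholder URL (not a real signature) to show how the pieces fit together without touching AWS:

```python
import json

# Hypothetical API Gateway proxy event, shaped like the one the handler receives.
event = {"queryStringParameters": {"file_name": "report.pdf"}}
file_name = event["queryStringParameters"]["file_name"]

# A response in the same shape the handler returns (placeholder URL for illustration).
response = {
    "statusCode": 200,
    "body": json.dumps({
        "upload_url": f"https://example-bucket.s3.amazonaws.com/{file_name}?X-Amz-Signature=abc123"
    }),
}

# The client parses the JSON body to recover the URL it should PUT the file to.
upload_url = json.loads(response["body"])["upload_url"]
print(upload_url.split("?")[0])  # https://example-bucket.s3.amazonaws.com/report.pdf
```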

Step 2: Deploy API Gateway to Invoke Lambda

  1. Create an API Gateway HTTP API with a route (for example, GET /upload-url).

  2. Configure a Lambda proxy integration that invokes the function from Step 1.

  3. Deploy the API (HTTP APIs can auto-deploy to the $default stage).
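These steps can also be scripted with the AWS CLI. The commands below are a sketch: the function name generate-upload-url, the region, and the account ID are placeholders. Quick-create with --target wires the default route to the Lambda function and deploys it automatically:

```shell
# Quick-create an HTTP API whose default route proxies to the Lambda function.
aws apigatewayv2 create-api \
  --name upload-url-api \
  --protocol-type HTTP \
  --target arn:aws:lambda:us-east-1:123456789012:function:generate-upload-url

# Grant API Gateway permission to invoke the function.
aws lambda add-permission \
  --function-name generate-upload-url \
  --statement-id apigw-invoke \
  --action lambda:InvokeFunction \
  --principal apigateway.amazonaws.com
```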

Step 3: Upload File Using the Pre-Signed URL

Once the mobile client receives the pre-signed URL, it can upload the file directly to S3:

curl -X PUT -T "file.txt" "<pre-signed-URL>"
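The same upload can be done from application code. Here is a minimal sketch using only the Python standard library; the function name and the placeholder URL in the comment are illustrative:

```python
import urllib.request

def upload_via_presigned_url(url: str, path: str) -> int:
    """PUT a local file to a pre-signed S3 URL; returns the HTTP status code."""
    with open(path, "rb") as f:
        req = urllib.request.Request(url, data=f.read(), method="PUT")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (a real call needs a valid, unexpired pre-signed URL):
# status = upload_via_presigned_url("https://example-bucket.s3.amazonaws.com/file.txt?X-Amz-...", "file.txt")
```

Note that the HTTP method must match the one the URL was signed for ('put_object' above), or S3 rejects the request with a signature mismatch.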

Conclusion

The traditional approach of uploading files via API Gateway is inefficient and impractical due to the 10MB payload limit. Using pre-signed S3 URLs offers a scalable, cost-effective, and high-performance solution for handling file uploads.

By following this approach, you can improve user experience, reduce costs, and ensure a smooth file upload process for your applications.
