Effortless Video Uploads: Building a Scalable Next.js App with AWS S3 Multipart Magic🪣
Table of contents
- Introduction
- Understanding Multipart Upload with AWS S3
- Implementing the Video Upload Feature
- Frontend Development Using Next.js
- Creating the Video Upload Form and Handling User Input and Validation
- Implementing the uploadFile function to handle Multipart Upload in Next.js
- Using the Upload form component in the /upload route
- Backend API Integration
- Designing the API Endpoints
- Configuring Server-side Logic for Multipart Upload
- Leveraging AWS SDK for Multipart Upload
- Handling Upload Errors and Retries
- Conclusion
Introduction
Overview of the Project
This article guides you through building a scalable Next.js application for video uploads using AWS S3's Multipart Upload feature. It covers setting up AWS credentials, understanding the benefits and process of Multipart Upload, and implementing the video upload feature with Next.js. The article also discusses error handling, optimization, and security considerations to ensure a robust and efficient upload system.
Importance of Scalable File Storage
In today's cloud-driven world, scalability ensures that your application can handle a growing user base without running into storage limitations. By integrating AWS S3 with Multipart Upload, your application demonstrates best practices for handling large-scale file storage efficiently.
Introduction to Next.js and AWS S3
Next.js: It’s a popular React framework that enables server-side rendering, static site generation, and API routes, making it an ideal choice for building fast, SEO-friendly, and scalable web applications. It simplifies the development of user interfaces while offering excellent flexibility and performance.
AWS S3 (Simple Storage Service): A highly scalable cloud storage service that enables developers to store and retrieve data, such as video files, from anywhere on the web. S3 offers low-latency, highly available storage, and is designed to handle large-scale storage needs. With features like Multipart Upload, it allows for the efficient uploading of large files, which is essential for video streaming and content-heavy applications.
Setting Up AWS Credentials and S3 Bucket
Create an AWS Account
If you don’t already have an AWS account, sign up at AWS. AWS offers a free tier with sufficient resources to get started with small projects.
Set Up an S3 Bucket
Go to the S3 service and create a new bucket.
Give your bucket a globally unique name (e.g., my-nextjs-videos).
Choose a region close to your user base for low latency.
Leave the default settings for Block Public Access (or configure as needed).
Enable Versioning and Server-side Encryption if needed.
Click Create Bucket.
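If you prefer the command line, the same bucket can be created with the AWS CLI. A quick sketch, assuming the example bucket name above and the ap-south-1 region (substitute your own values):
# Create the bucket (regions other than us-east-1 need a LocationConstraint)
aws s3api create-bucket \
  --bucket my-nextjs-videos \
  --region ap-south-1 \
  --create-bucket-configuration LocationConstraint=ap-south-1

# Optionally turn on versioning
aws s3api put-bucket-versioning \
  --bucket my-nextjs-videos \
  --versioning-configuration Status=Enabled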
Create an IAM User and Assign Permissions
To securely interact with AWS services, you'll need an IAM (Identity and Access Management) user with the necessary permissions.
Navigate to IAM
Create a New IAM User:
Go to Users and click Add users.
Enter a username like S3AccessUser.
Select Programmatic access to generate an Access Key ID and Secret Access Key.
Attach Permissions:
On the next screen, choose “Attach existing policies directly.”
Search for and attach the AmazonS3FullAccess policy.
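AmazonS3FullAccess is the quickest way to get going, but for production you would normally scope the user down. Here is a minimal sketch of a tighter policy covering only the multipart-upload operations this article uses; the bucket name is the earlier example and an assumption, so substitute your own:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "arn:aws:s3:::my-nextjs-videos/*"
    }
  ]
}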
Understanding Multipart Upload with AWS S3
What is Multipart Upload?
Multipart Upload is a feature of AWS S3 that allows you to upload large files by breaking them into smaller parts and uploading them independently, in parallel. This method is especially useful when dealing with large files, such as video content, that might otherwise fail or take too long to upload as a single request. Let’s dive deep into this feature.
Benefits of Using Multipart Upload
Improved Upload Performance: Multipart Upload allows you to upload large files in smaller parts, which can be uploaded in parallel, significantly improving the overall upload speed and efficiency.
Increased Reliability: By uploading files in parts, Multipart Upload reduces the risk of failure. If a part fails to upload, it can be retried without affecting the entire upload process.
Flexibility in Resuming Uploads: Multipart Upload provides the ability to pause and resume uploads, which is particularly useful for handling network interruptions or other disruptions during the upload process.
How Multipart Upload Works in S3
1. User Sends a Video Upload Request
The process begins when a user selects a video file to upload. Along with the file, the user also sends the necessary file metadata (such as the file name, size, and format) to the server. This metadata helps the server understand how to handle and store the file.
2. Server Receives Metadata and Initiates Multipart Upload
Once the server receives the video upload request and file metadata, it prepares to upload the file to AWS S3. Since the file might be large, the server opts for Multipart Upload, where the file is divided into smaller, manageable chunks. At this stage, the server sends an initial request to AWS S3 to initiate the multipart upload and gets back an Upload ID.
3. Upload ID Generation
Upon receiving the request, AWS S3 generates and returns an Upload ID to the server. This ID is essential for tracking the file’s upload progress and helps ensure all the parts are correctly assembled later.
4. Uploading File Parts
With the Upload ID, the server begins uploading each chunk (or part) of the file to AWS S3. This parallel upload process speeds up the entire transfer, as multiple parts can be uploaded simultaneously rather than one large file in a single stream. Each part is identified by its ETag and part number, ensuring AWS S3 can later assemble the parts in the correct sequence.
5. Complete Upload Request
Once all the file parts have been successfully uploaded, the server sends a final request to AWS S3 to complete the upload. This request informs AWS S3 that all parts of the file are uploaded and ready for assembly into a single object.
6. Receiving the Location of the Object
After AWS S3 assembles the parts into the original file, it responds to the server with the location (URL) where the video file is stored. The server can now return this location to the user or any other service that requires access to the uploaded file.
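In code, the whole exchange boils down to three S3 API calls plus the per-part uploads. Here is a condensed, server-side sketch of the sequence using the AWS SDK for JavaScript (v2); the bucket, key, and pre-split chunks are placeholders, and the full request-handling versions of these calls are implemented later in this article.
// Condensed sketch of the multipart flow (full implementation appears later in this article)
import AWS from 'aws-sdk';

const s3 = new AWS.S3();

async function multipartFlow(bucket: string, key: string, chunks: Buffer[]) {
  // Steps 1-3: initiate the upload and receive an Upload ID
  const { UploadId } = await s3
    .createMultipartUpload({ Bucket: bucket, Key: key })
    .promise();

  // Step 4: upload each part, collecting its ETag and part number
  const parts: AWS.S3.CompletedPart[] = [];
  for (let i = 0; i < chunks.length; i++) {
    const { ETag } = await s3
      .uploadPart({
        Bucket: bucket,
        Key: key,
        UploadId: UploadId!,
        PartNumber: i + 1,
        Body: chunks[i],
      })
      .promise();
    parts.push({ ETag, PartNumber: i + 1 });
  }

  // Steps 5-6: ask S3 to assemble the parts and return the stored object's location
  const { Location } = await s3
    .completeMultipartUpload({
      Bucket: bucket,
      Key: key,
      UploadId: UploadId!,
      MultipartUpload: { Parts: parts },
    })
    .promise();

  return Location;
}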
Implementing the Video Upload Feature
Frontend Development Using Next.js
The first step is to build a simple and user-friendly form that allows users to upload their video files. Here’s an example of how you can implement it in Next.js:
npx create-next-app@latest my-app
cd my-app
Creating the Video Upload Form and Handling User Input and Validation
// app/components/VideoUploadForm.tsx
'use client';
import { useState, FormEvent, useCallback } from 'react';
import { useRouter } from 'next/navigation';
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per chunk (S3's minimum part size for every part except the last)
interface UploadPart{
ETag:string;
PartNumber:number;
}
export default function VideoUploadForm() {
const [formData, setFormData] = useState({
title: '',
videoFile: null as File | null,
});
const [loading, setLoading] = useState(false);
const [uploadProgress, setUploadProgress] = useState(0);
const router = useRouter();
//Handle submit
const handleSubmit = async (e: FormEvent) => {
e.preventDefault();
setLoading(true);
try {
if (!formData.videoFile) {
throw new Error('No video file selected');
}
// Upload video file
const uploadResult = await uploadFile(formData.videoFile);
// Save the video metadata (e.g., title and uploadResult.location) to your database here
router.push('/');
} catch (error) {
console.error('Error during upload:', error);
} finally {
setLoading(false);
}
};
return (
//Title of the video
<form onSubmit={handleSubmit} className="max-w-2xl mx-auto p-6 space-y-6">
<div>
<label className="block text-sm font-medium text-gray-700">Title</label>
<input
type="text"
required
className="mt-1 block w-full rounded-md border-gray-300 shadow-sm focus:border-indigo-500 focus:ring-indigo-500"
value={formData.title}
onChange={(e) => setFormData({ ...formData, title: e.target.value })}
/>
</div>
<div>
<label className="block text-sm font-medium text-gray-700">Video File</label>
<input
type="file"
accept="video/*"
required
className="mt-1 block w-full"
onChange={(e) => setFormData({
...formData,
videoFile: e.target.files ? e.target.files[0] : null,
})}
/>
</div>
{uploadProgress > 0 && uploadProgress < 100 && (
<div className="w-full bg-gray-200 rounded-full h-2.5">
<div
className="bg-indigo-600 h-2.5 rounded-full"
style={{ width: `${uploadProgress}%` }}
></div>
</div>
)}
<button
type="submit"
disabled={loading}
className="w-full bg-indigo-600 text-white py-2 px-4 rounded-md hover:bg-indigo-700 disabled:bg-gray-400"
>
{loading ? 'Uploading...' : 'Upload Video'}
</button>
</form>
);
}
Before wiring this component into the upload page, we need the uploadFile function that handleSubmit calls.
Implementing the uploadFile function to handle Multipart Upload in Next.js
// uploadFile: place this inside VideoUploadForm, above handleSubmit
const uploadFile = useCallback(async (file: File) => {
try {
// Step 1: Initiate multipart upload
const initiateResponse = await fetch('http://localhost:8080/upload/initialize', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
filename: file.name,
mimeType: file.type,
}),
});
if (!initiateResponse.ok) throw new Error('Failed to initiate upload');
const { uploadId, key } = await initiateResponse.json();
// Step 2: Upload parts
const parts: UploadPart[] = [];
const chunks = Math.ceil(file.size / CHUNK_SIZE);
// For each part
for (let partNumber = 1; partNumber <= chunks; partNumber++) {
const start = (partNumber - 1) * CHUNK_SIZE;
const end = Math.min(start + CHUNK_SIZE, file.size);
const chunk = file.slice(start, end);
// Create form data for this chunk
const formData = new FormData();
formData.append('chunk', chunk);
formData.append('uploadId',uploadId);
formData.append('partNumber', partNumber.toString()); // Strings are expected for form data
formData.append('key', key);
const uploadPartResponse = await fetch('http://localhost:8080/upload/upload-part', {
method: 'POST',
body: formData,
});
if (!uploadPartResponse.ok) {
// Abort the upload so S3 doesn't keep orphaned parts around
await fetch('http://localhost:8080/upload/abort', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
uploadId,
key,
}),
});
alert(`Failed to upload part ${partNumber} of the video; the upload was aborted`);
throw new Error(`Failed to upload part ${partNumber}`);
}
else{
const { ETag } = await uploadPartResponse.json();
parts.push({ ETag, PartNumber: partNumber });
// Update progress
setUploadProgress((partNumber / chunks) * 100);
}
}
// Step 3: Complete multipart upload
const completeResponse = await fetch('http://localhost:8080/upload/complete', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
uploadId,
key,
parts
}),
});
if (!completeResponse.ok) throw new Error('Failed to complete upload');
return await completeResponse.json();
} catch (error) {
console.error('Upload failed:', error);
throw error;
}
}, []);
Using the Upload form component in the /upload route.
// app/upload/page.tsx
'use client';
import VideoUploadForm from '@/app/components/VideoUploadForm';
export default function UploadPage() {
return (
<div className="container mx-auto px-4 py-8">
<h1 className="text-3xl font-bold mb-8">Upload a New Video</h1>
<VideoUploadForm />
</div>
);
}
Backend API Integration
For the backend implementation I've used Node.js with Express, organized along an MVC (Model-View-Controller) architecture.
//package.json
{
"dependencies": {
"aws-sdk": "^2.1691.0",
"cors": "^2.8.5",
"dotenv": "^16.4.5",
"express": "^4.21.0",
"jsonwebtoken": "^9.0.2",
"multer": "^1.4.5-lts.1",
"user-service": "file:"
},
"devDependencies": {
"@types/aws-sdk": "^0.0.42",
"@types/cors": "^2.8.17",
"@types/dotenv": "^6.1.1",
"@types/express": "^5.0.0",
"@types/multer": "^1.4.12",
"ts-node": "^10.9.2",
"tsc-watch": "^6.2.0",
"typescript": "^5.6.2"
},
"name": "user-service",
"version": "1.0.0",
"main": "index.js",
"scripts": {
"start": "node dist/index.js",
"build": "tsc -p .",
"dev": "tsc-watch --onSuccess \"node dist/index.js\""
},
"keywords": [],
"author": "",
"license": "ISC",
"description": ""
}
package.json file for all the required dependencies.
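The build and dev scripts above assume a TypeScript config that compiles src/ into dist/. The article doesn't include one, so the tsconfig.json below is a minimal assumed setup that works with those scripts:
// tsconfig.json (assumed; adjust to taste)
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "rootDir": "src",
    "outDir": "dist",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true
  },
  "include": ["src"]
}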
Designing the API Endpoints
To implement AWS Multipart Upload on the server side, we need to implement four API endpoints:
→ Initialize Upload.
→ Upload Chunk.
→ Complete Upload.
→ Abort Upload in case of failure.
//src/routes/upload.routes.ts
import { Router } from 'express';
import multer from 'multer';
import {
initiateMultipartUpload,
uploadPart,
completeMultipartUpload,
abortMultipartUpload,
} from '../controllers/upload.controller';
const router = Router();
// Set up multer with in-memory storage and a 5MB size limit per chunk (matching CHUNK_SIZE on the frontend)
const upload = multer({
limits: {
fileSize: 5 * 1024 * 1024,
}
});
// Route to initialize multipart upload and generate an Upload ID
router.post('/initialize', upload.none(), initiateMultipartUpload);
// Route to upload individual file chunks to S3
router.post('/upload-part', upload.single('chunk'), uploadPart);
// Route to complete the upload process after all parts are uploaded
router.post('/complete', completeMultipartUpload);
// Route to abort the multipart upload in case of failure or cancellation
router.post('/abort', abortMultipartUpload);
export default router;
Here we've implemented 4 routes to:
Initialize a multipart upload.
Upload an individual part.
Complete the upload after all the parts are uploaded.
Abort the upload in case of an issue or failure.
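The router above still has to be mounted on an Express app. The article doesn't show the entry file, so here is a minimal sketch of a src/index.ts that matches the frontend's http://localhost:8080/upload/... URLs; the file name and port are assumptions.
// src/index.ts: minimal server entry (assumed, not shown in the original article)
import 'dotenv/config'; // load .env before any module reads process.env (the S3 client does this at import time)
import express from 'express';
import cors from 'cors';
import uploadRoutes from './routes/upload.routes';

const app = express();

app.use(cors()); // let the Next.js dev server (a different origin) call this API
app.use(express.json()); // parse JSON bodies for /initialize, /complete and /abort

// All multipart-upload endpoints live under /upload
app.use('/upload', uploadRoutes);

const PORT = process.env.PORT || 8080;
app.listen(PORT, () => console.log(`Upload service listening on port ${PORT}`));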
Configuring Server-side Logic for Multipart Upload
// src/controllers/upload.controller.ts
import { Request, Response } from 'express';
import AWS, { AWSError } from 'aws-sdk';
const s3 = new AWS.S3({
accessKeyId: process.env.AWS_ACCESS_KEY_ID,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
region: process.env.AWS_REGION,
});
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;
To enable Multipart Upload we need an IAM user with S3 access, as set up earlier.
The controller reads the following credentials from environment variables (see the sample .env below):
AWS Access Key ID of the IAM user with S3 access.
AWS Secret Access Key of the same user.
Bucket name.
Region of the bucket.
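These values are read from environment variables, so a .env file at the root of the backend service (loaded via dotenv) is the natural place for them. The values below are placeholders:
# .env (placeholders; use your own IAM user's credentials and bucket)
AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=ap-south-1
AWS_BUCKET_NAME=my-nextjs-videos
Keep this file out of version control; the keys grant direct access to your bucket.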
Leveraging AWS SDK for Multipart Upload
By installing the AWS SDK package in our application we can use AWS's Multipart Upload API.
npm i aws-sdk
npm i -D @types/aws-sdk
Initiating Multipart Upload
The first API call in the AWS S3 Multipart Upload flow is the initialize-upload request, which can be implemented in the following manner.
// src/controllers/upload.controller.ts
interface InitiateUploadRequest extends Request {
body: {
userId: string;
filename: string;
mimeType: string;
};
}
export const initiateMultipartUpload = async (
req: InitiateUploadRequest,
res: Response
): Promise<void> => {
const { filename, mimeType, userId } = req.body;
try {
const params = {
Bucket: BUCKET_NAME!,
Key: `videos/${userId}/${Date.now()}-${filename}`,
ContentType: mimeType,
};
await s3.createMultipartUpload(params)
.promise()
.then((data)=> {
res.status(200).json({
uploadId: data.UploadId,
key: data.Key,
});
console.log("Initialized multipart upload with upload id",data.UploadId)
})
.catch((err:AWSError)=>{
res.status(500).json({
message:'Failed to initialize multipart upload from AWS'
});
console.log("Failed to initialize multipart upload from AWS",err.message)
});
return;
} catch (error) {
console.error('Error initiating multipart upload', error);
res.status(500).json({ error: 'Error initiating multipart upload' });
}
};
Uploading Video Parts to S3
After a successful initialize request we receive an uploadId in the response. The uploadId is a unique identifier for that particular upload and is used throughout the upload process.
Now, using that uploadId, we upload each part of the file along with its part number and the key (the location of the file in the bucket).
interface UploadPartRequest extends Request {
body: {
uploadId: string;
partNumber: string;
key: string;
};
}
export const uploadPart = async (
req: UploadPartRequest,
res: Response
): Promise<void> => {
const { uploadId, partNumber, key } = req.body;
if(!req.file){
res.status(400).json({error:"No file found"});
return;
}
try {
const params = {
Bucket: BUCKET_NAME!,
Key: key,
PartNumber: parseInt(partNumber,10),
UploadId: uploadId,
Body: req.file.buffer,
};
await s3.uploadPart(params)
.promise()
.then((data)=>{
res.status(200).json({
ETag: data.ETag,
PartNumber: parseInt(partNumber, 10)
});
})
.catch((err:AWSError)=>{
res.status(500).json({
message: 'Failed to upload part ' + partNumber + ' to AWS: ' + err.message
});
});
return;
} catch (error) {
console.error('Error uploading part', error);
res.status(500).json({ error: 'Error uploading part' });
}
};
Completing the Multipart Upload
Once all the parts are uploaded successfully, we send a complete-upload request to the Multipart Upload API, indicating that we've uploaded all parts of the file.
In the request body we send:
The bucket name.
Key (the location where the file is stored).
The uploadId.
Parts: an array of ETags with their associated part numbers, so S3 can verify it has every part and assemble them in the correct order.
interface CompleteUploadRequest extends Request {
body: {
uploadId: string;
key: string;
parts: Array<{ ETag: string; PartNumber: number }>;
};
}
// Complete multipart upload
export const completeMultipartUpload = async (
req: CompleteUploadRequest,
res: Response
): Promise<void> => {
const { uploadId, key, parts } = req.body;
try {
if (!uploadId || !key || !parts || parts.length === 0) {
res.status(400).json({ error: 'Missing required fields or parts' });
return;
}
const params = {
Bucket: BUCKET_NAME!,
Key: key,
UploadId: uploadId,
MultipartUpload: {
Parts: parts.map((part) => ({
ETag: part.ETag,
PartNumber: part.PartNumber,
})),
},
};
const completeMultipartUploadResponse = await s3.completeMultipartUpload(params).promise();
//Can send the video to a transcoder service
res.status(200).json({
message: 'Upload completed successfully',
location: completeMultipartUploadResponse.Location,
});
return;
} catch (error) {
console.error('Error completing multipart upload', error);
res.status(500).json({ error: 'Error completing multipart upload' });
}
};
Aborting the Multipart Upload
The AWS Multipart Upload API also provides an abort operation that cancels the upload process in case of any interruption or failure, freeing the storage consumed by any parts that were already uploaded.
For this API we would require:
The uploadId of the request.
The key (location of the file in the bucket).
interface AbortUploadRequest extends Request {
body: {
uploadId: string;
key: string;
};
}
export const abortMultipartUpload = async (
req: AbortUploadRequest,
res: Response
): Promise<void> => {
const { uploadId, key } = req.body;
try {
const params = {
Bucket: BUCKET_NAME!,
Key: key,
UploadId: uploadId,
};
await s3.abortMultipartUpload(params).promise();
res.status(200).json({ message: 'Multipart upload aborted successfully' });
} catch (error) {
console.error('Error aborting multipart upload', error);
res.status(500).json({ error: 'Error aborting multipart upload' });
}
};
Handling Upload Errors and Retries
It's important to have error handling mechanisms to ensure our APIs are robust and the server remains stable.
Error Handling and Optimization
To make our upload system more robust, we should implement error handling and optimization techniques:
Retry Logic: Implement a retry mechanism for failed part uploads.
Abort Upload: Create an API route to abort the upload if too many retries fail.
Optimize Chunk Size: Adjust the chunk size based on network conditions and file size.
Parallel Uploads: Implement concurrent part uploads to speed up the process for large files. A sketch combining retries and limited concurrency follows below.
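As a concrete illustration of the first and last points, here is a hedged sketch of how the part-upload loop from uploadFile could be wrapped with per-part retries and limited concurrency. The uploadSinglePart callback is a hypothetical stand-in for the fetch('/upload/upload-part') call shown earlier, not code from the original component.
// Sketch: retry one part with exponential backoff, and upload parts in small
// parallel batches instead of strictly one at a time. uploadSinglePart is a
// hypothetical helper that wraps the POST to /upload/upload-part.
type PartResult = { ETag: string; PartNumber: number };
type PartUploader = (partNumber: number) => Promise<PartResult>;

async function uploadPartWithRetry(
  uploadSinglePart: PartUploader,
  partNumber: number,
  maxRetries = 3
): Promise<PartResult> {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await uploadSinglePart(partNumber);
    } catch (err) {
      if (attempt === maxRetries) throw err; // give up; the caller should hit /upload/abort
      await new Promise((r) => setTimeout(r, 2 ** attempt * 500)); // exponential backoff: 1s, then 2s, ...
    }
  }
  throw new Error('unreachable');
}

async function uploadAllParts(
  totalParts: number,
  uploadSinglePart: PartUploader,
  concurrency = 3
): Promise<PartResult[]> {
  const results: PartResult[] = [];
  // Process the parts in batches of `concurrency`
  for (let first = 1; first <= totalParts; first += concurrency) {
    const batch: Promise<PartResult>[] = [];
    for (let p = first; p < first + concurrency && p <= totalParts; p++) {
      batch.push(uploadPartWithRetry(uploadSinglePart, p));
    }
    results.push(...(await Promise.all(batch)));
  }
  // S3 expects the parts list in ascending PartNumber order when completing the upload
  return results.sort((a, b) => a.PartNumber - b.PartNumber);
}
A small fixed concurrency (2-4) is usually enough, since browsers cap the number of parallel connections to a single host anyway.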
Security considerations and Enhancements
To secure and enhance our upload system:
Authentication: Implement user authentication to ensure only authorized users can upload files.
Signed URLs: Use pre-signed URLs for more secure S3 uploads (see the sketch after this list).
File Validation: Implement client-side file type and size validation.
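For pre-signed URLs, the same aws-sdk client the backend already uses can sign an uploadPart request so the browser PUTs each chunk directly to S3 instead of streaming it through Express. This endpoint is not part of the article's API; it's a sketch of the enhancement, reusing the s3 client and BUCKET_NAME from upload.controller.ts.
// Sketch: return a short-lived pre-signed URL for a single part.
// The client PUTs the raw chunk to this URL and reads the ETag response header
// to build the parts list for the complete-upload call.
export const getPresignedPartUrl = async (
  req: Request,
  res: Response
): Promise<void> => {
  const { uploadId, key, partNumber } = req.body;
  try {
    const url = await s3.getSignedUrlPromise('uploadPart', {
      Bucket: BUCKET_NAME!,
      Key: key,
      UploadId: uploadId,
      PartNumber: parseInt(partNumber, 10),
      Expires: 60 * 5, // the URL stays valid for five minutes
    });
    res.status(200).json({ url });
  } catch (error) {
    console.error('Error creating pre-signed URL', error);
    res.status(500).json({ error: 'Error creating pre-signed URL' });
  }
};
Note that the bucket's CORS configuration must expose the ETag header for the browser to read it from the PUT response.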
Conclusion
Building a scalable video upload system with Next.js and AWS S3 Multipart Upload provides a robust solution for handling large file uploads. By breaking down the upload process into manageable chunks, we can create a more reliable and efficient system.