Running Puppeteer Seamlessly on Serverless Platforms like Vercel

This morning I learned that Puppeteer in its current form doesn't work out of the box on platforms like Vercel (or AWS Lambda, though I only have personal experience with Vercel).
As I needed to convert HTML to PDF using Puppeteer for https://trackjobs.online , I set things up and pushed the code for deployment, but it didn't work.
After some Googling I found this thread: https://github.com/orgs/vercel/discussions/124 . It is the most comprehensive discussion on the subject, and it's where I learned that Vercel's size limit for serverless functions is 50 MB.
To overcome this, there are a few options:
Use Chromium from https://github.com/Sparticuz/chrome-aws-lambda , but it is deprecated and hence not ideal to use.
Some companies offer Puppeteer as a service, but that is quite expensive and impractical.
Use https://github.com/sparticuz/chromium (made by the same developer as chrome-aws-lambda).
But things aren't as rosy as we might like: even after following the steps, I hit a very weird issue. As the error suggests, the executable path for Chromium is missing.
So I found this article: https://www.stefanjudis.com/blog/how-to-use-headless-chrome-in-serverless-functions/ and it partly solved my problem, but not entirely, as it was a bit outdated.
We will be using chromium-min and puppeteer-core.
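If you haven't added them yet, the install is the usual (these are the exact package names; the versions that worked for me are listed near the end of this post):

npm install puppeteer puppeteer-core @sparticuz/chromium-min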
The @sparticuz/chromium-min package provides a minimal version of Chromium tailored for serverless platforms, addressing deployment constraints such as package size limitations. Unlike the standard @sparticuz/chromium package, which includes the Chromium binary compressed with Brotli, chromium-min omits these large binary files. This design allows you to manage the Chromium binary separately, making it easier to comply with platform-specific size restrictions.
To use chromium-min, you must supply the Chromium binary from an external source. This can be achieved by hosting the necessary files on a service like AWS S3 or a CDN and configuring the application to download them at runtime. This approach keeps the deployment package within the size limit while still providing access to the required Chromium executable.
puppeteer-core is a lightweight version of the library designed for developers who need to control browsers supporting the DevTools Protocol without the overhead of downloading a specific browser version. It does not include a bundled browser, giving developers the flexibility to connect to existing browser installations or remote instances.
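For instance, besides pointing it at a local binary, puppeteer-core can attach to an already-running browser over the DevTools Protocol (the WebSocket endpoint below is a placeholder):

import puppeteerCore from 'puppeteer-core';

// Attach to an existing browser via its DevTools WebSocket endpoint.
// The endpoint here is a placeholder; a real one comes from a browser
// you launched yourself or from a remote browser service.
const browser = await puppeteerCore.connect({
  browserWSEndpoint: 'ws://127.0.0.1:9222/devtools/browser/<session-id>',
});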
So, as of 14th December 2024, I found a solution and it is working for me.
Some ground assumptions from my side:
I am using Next.js 14.2.x with the app router.
The Node.js version is 20+.
import { NextResponse, type NextRequest } from "next/server";
import puppeteer, { type Browser } from 'puppeteer';
import puppeteerCore, { type Browser as BrowserCore } from 'puppeteer-core';
import chromium from '@sparticuz/chromium-min';

export const dynamic = 'force-dynamic';
export const maxDuration = 60;

export async function GET(request: NextRequest) {
  try {
    // An example of the surrounding logic
    const searchParams = request.nextUrl.searchParams;
    const generatedResumeID = searchParams.get("generated_resume_id");
    if (!generatedResumeID) return NextResponse.json({ error: "Invalid request" }, { status: 400 });

    // finalHTML and resumeTitle come from my code's logic; the placeholders
    // below only keep this snippet self-contained (you can just ignore them).
    const finalHTML = '<html><body><h1>Resume</h1></body></html>';
    const resumeTitle = 'resume';

    // THE CORE LOGIC
    let browser: Browser | BrowserCore;
    if (process.env.NODE_ENV === 'production' || process.env.VERCEL_ENV === 'production') {
      // Keep this pack version in sync with @sparticuz/chromium-min in your package.json.
      const executablePath = await chromium.executablePath(
        'https://github.com/Sparticuz/chromium/releases/download/v131.0.1/chromium-v131.0.1-pack.tar'
      );
      browser = await puppeteerCore.launch({
        executablePath,
        // You can pass other configs as required
        args: chromium.args,
        headless: chromium.headless,
        defaultViewport: chromium.defaultViewport,
      });
    } else {
      browser = await puppeteer.launch({
        headless: true,
        args: ['--no-sandbox', '--disable-setuid-sandbox'],
      });
    }

    const page = await browser.newPage();
    await page.setContent(finalHTML, {
      waitUntil: 'networkidle0',
    });
    const pdf = await page.pdf({
      format: 'A4',
      printBackground: true,
      margin: {
        top: '10px',
        right: '10px',
        bottom: '10px',
        left: '10px',
      },
    });
    await browser.close();

    // page.pdf() returns a Uint8Array; Buffer.from keeps the response body
    // (and TypeScript) happy.
    return new NextResponse(Buffer.from(pdf), {
      status: 200,
      headers: {
        'Content-Type': 'application/pdf',
        'Content-Disposition': `attachment; filename="${resumeTitle}.pdf"`,
      },
    });
  } catch (error) {
    console.error('PDF generation error:', error);
    return NextResponse.json(
      { message: 'Error generating PDF' },
      { status: 500 }
    );
  }
}
Essentially, the trick here is to use puppeteer-core instead of the full puppeteer package and chromium-min instead of chromium. It is lightweight and does the job.
I haven't dived into the technicalities, assuming that you understand the basics and how to configure most of these things.
Here are the versions of the tools, right from my package.json:
"puppeteer": "^23.10.4",
"puppeteer-core": "^23.10.4",
"@sparticuz/chromium-min": "^131.0.1",
"next": "^14.2.20",
The basic idea is to limit the size while still shipping a usable Chromium. You can also use your own CDN for even faster delivery, but for me GitHub's CDN is good enough.
Installation guide: https://github.com/Sparticuz/chromium?tab=readme-ov-file#install
This should be enough to get you started.
Update 1: 31st March 2025
So, Vercel introduced a new feature called Fluid Compute, which is pitched as a way to save resources.
I don't buy into it much, but I still enabled it in my project, and the Puppeteer part stopped working with an error.
The solution to this error is pretty simple. Upgrade chromium-min to v133.0.0 and change the .tar executable's path to https://github.com/Sparticuz/chromium/releases/download/v133.0.0/chromium-v133.0.0-pack.tar (earlier it was v131.0.1):
"@sparticuz/chromium-min": "^133.0.0"
Although things were working after that, some errors were still being logged, so this is more of a patch. To be on the safe side, upgrade puppeteer to v24.4.0 as well (I am still on v23.10.4); maybe that would take the persistent errors away.
Extras:
https://github.com/ozgrozer/x-image (it tries to implement the same thing with the pages router; I'm not sure how well it works, but you can give it a try).