Picture this: one fine day, you’ve built yourself a nice lil Next.js app that uses Puppeteer to generate PDFs. Everything works perfectly (locally). PDFs generate beautifully, you’re happy and thinking your mom must be proud. You move on to building other parts of the app. Then you deploy to Vercel (cuz what other option do you have?), and then:
Error: Could not find Chrome (ver. 138.0.7204.94).
This can occur if either:
1. you did not perform an installation before running the script
2. your cache path is incorrectly configured
This will definitely make you question your self-worth. In my case, I had no clue that a Chrome binary was even involved with Puppeteer, so I did some digging and found out that the Chrome Puppeteer relies on simply isn’t available in Vercel’s serverless environment. That was frustrating to learn, because when I asked ChatGPT, it told me to rip out Puppeteer entirely (since it doesn’t work on the serverless runtime) and switch to React PDF or to external services that charge crazy amounts for their Puppeteer APIs. I was about to give up, then decided to look it up on YouTube, and thanks to one guy I found a video and an article from his video description. So enough whining, let’s cut to the chase. Here’s what I did.
The Initial Setup (What Seemed to Work Locally)
My original setup was straightforward — a typical Next.js app with Puppeteer for PDF generation:
// Original working local setup
import puppeteer from "puppeteer";

export async function generateCertificatePDF(data: CertificateData): Promise<Buffer> {
  const browser = await puppeteer.launch({
    headless: true,
    args: [
      "--no-sandbox",
      "--disable-setuid-sandbox",
      "--disable-web-security",
    ],
  });

  const page = await browser.newPage();

  // Certificate markup built from the incoming data (real template omitted here)
  const htmlContent = `<html><body><pre>${JSON.stringify(data)}</pre></body></html>`;

  await page.setContent(htmlContent);
  const pdfBuffer = await page.pdf({ format: "A4" });
  await browser.close();

  // page.pdf() returns a Uint8Array in recent Puppeteer versions, so wrap it in a Buffer
  return Buffer.from(pdfBuffer);
}
Why it worked locally:
- Puppeteer automatically downloads and bundles Chromium (~300MB)
- My local machine has plenty of storage and memory
- No bundle size restrictions
Why it failed on Vercel:
- Vercel has a 50MB function bundle size limit
- Puppeteer’s bundled Chromium is ~300MB (6x over the limit!)
- Serverless environments don’t include Chrome/Chromium by default
The Critical Database Corruption Issue
But the worst part wasn’t just the Chrome error. My app had a critical bug:
// DANGEROUS: This was updating the database even when PDF generation failed!
await updateInternCertificateKey(intern.email, s3Key); // s3Key was NULL!
When PDF generation failed, my database was still being updated with NULL S3 keys (the fix is sketched after this list). This meant:
- Users couldn’t regenerate their PDFs (the app thought they already had one)
- App logic broke (it tried to fetch NULL S3 keys)
- Production was corrupted with bad data
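To be concrete about the fix on that front: don’t touch the database unless the PDF bytes and the S3 upload both actually succeed. Here’s a minimal sketch of that guard; uploadToS3 is a placeholder name for whatever upload helper you use, while generateCertificatePDF and updateInternCertificateKey are the functions mentioned above:
// Only update the database after the PDF and the upload both succeed
try {
  const pdfBuffer = await generateCertificatePDF(data);

  // uploadToS3 is a stand-in for your actual S3 upload helper
  const s3Key = await uploadToS3(pdfBuffer, `certificates/${intern.email}.pdf`);
  if (!s3Key) {
    throw new Error("S3 upload returned no key");
  }

  // Safe: we only reach this line with a real, non-null key
  await updateInternCertificateKey(intern.email, s3Key);
} catch (error) {
  console.error("Certificate generation failed; database left untouched:", error);
  throw error;
}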
Solution Exploration: The Swing Between Approaches
Approach 1: React PDF (The Serverless-Native Alternative)
My first instinct was to ditch Puppeteer entirely in favor of @react-pdf/renderer:
Pros:
- Serverless-friendly (~4MB bundle)
- No browser dependencies
- Fast and reliable
- Perfect for structured documents like certificates
Cons:
- Requires rewriting all PDF generation logic
- Less flexible than full browser rendering
// React PDF approach - clean but requires rewrite
import { pdf, Document, Page, Text } from '@react-pdf/renderer';

const MyDocument = () => (
  <Document>
    <Page>
      <Text>Certificate content...</Text>
    </Page>
  </Document>
);

const buffer = await pdf(<MyDocument />).toBuffer();
Approach 2: External Browser Services
Services like Browserless.io provide remote Chrome instances:
Pros:
- No local Chrome needed
- Powerful and full-featured
- Handles all the serverless complexity
Cons:
- Costs money ($200+ per month)
- External dependency
- Network latency
- Not suitable for free/low-cost projects
// Browserless approach - works but costs money
import puppeteer from "puppeteer-core";

const browser = await puppeteer.connect({
  browserWSEndpoint: `wss://chrome.browserless.io?token=${process.env.BLESS_TOKEN}`,
});
Approach 3: Serverless Chromium
The community-maintained @sparticuz/chromium package emerged as the "just right" solution. Thanks to the YouTube video mentioned above, I saved myself from spiraling into deprecated packages and wrong binaries just cuz GPT said so.
Pros:
- Designed specifically for serverless
- Fits within bundle size limits
- Keeps existing Puppeteer code
- Free and open source
- Actively maintained
Cons:
- ⚠️ Setup complexity
- ⚠️ Performance is slower than local Chrome
- ⚠️ Bundle size edge cases
The Implementation Journey
Step 1: Package Installation
# Remove the heavyweight
npm uninstall puppeteer
# Install serverless-friendly versions
npm install @sparticuz/chromium puppeteer-core
# Keep puppeteer for local development
npm install --save-dev puppeteer
Step 2: Environment-Aware Browser Launch
The key insight was to use different browser launch strategies for development vs. production:
async function getBrowser() {
  if (process.env.NODE_ENV === 'production') {
    // Production: Serverless Chromium
    const [chromiumModule, puppeteerModule] = await Promise.all([
      import('@sparticuz/chromium'),
      import('puppeteer-core'),
    ]);
    const chromium = chromiumModule.default;
    const puppeteer = puppeteerModule.default;

    return puppeteer.launch({
      args: [...chromium.args, '--no-sandbox', '--disable-setuid-sandbox'],
      defaultViewport: chromium.defaultViewport,
      executablePath: await chromium.executablePath(),
      headless: chromium.headless,
      ignoreHTTPSErrors: true,
    });
  } else {
    // Local: Full Puppeteer with bundled Chrome
    const puppeteerModule = await import('puppeteer');
    const puppeteer = puppeteerModule.default;

    return puppeteer.launch({
      headless: true,
      args: ["--no-sandbox", "--disable-setuid-sandbox"],
    });
  }
}
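With that helper in place, the PDF generator itself barely changes: launch through getBrowser() instead of calling puppeteer.launch() directly. Here’s a rough sketch of how mine ended up, with the HTML template still stubbed out, so treat it as an outline rather than a drop-in file:
// Environment-aware version of the PDF generator from earlier
export async function generateCertificatePDF(data: CertificateData): Promise<Buffer> {
  const browser = await getBrowser();

  try {
    const page = await browser.newPage();

    // Stub markup; the real certificate template goes here
    const htmlContent = `<html><body><pre>${JSON.stringify(data)}</pre></body></html>`;

    await page.setContent(htmlContent, { waitUntil: "networkidle0" });
    const pdfBuffer = await page.pdf({ format: "A4", printBackground: true });

    // page.pdf() returns a Uint8Array in recent Puppeteer versions
    return Buffer.from(pdfBuffer);
  } finally {
    // Always close the browser, even if generation throws
    await browser.close();
  }
}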
Step 3: Next.js Configuration Hell
Next.js needed to be told how to handle these packages:
// next.config.ts - Updated for Next.js 15
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  // This option moved out of `experimental` in Next.js 15!
  serverExternalPackages: ['puppeteer-core', '@sparticuz/chromium'],
  webpack: (config, { isServer }) => {
    if (isServer) {
      config.externals.push('@sparticuz/chromium');
    }
    return config;
  },
};

export default nextConfig;
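And for completeness, here’s roughly what an API route tying it all together can look like in the App Router. The route path and request shape here are my assumptions, not something you have to copy; the parts that matter are forcing the Node.js runtime (Puppeteer won’t run on the Edge runtime) and returning the buffer with PDF headers:
// app/api/certificate/route.ts (hypothetical path)
import { NextResponse } from "next/server";
import { generateCertificatePDF } from "@/lib/pdf"; // wherever your generator lives

// Puppeteer needs the Node.js runtime, not the Edge runtime
export const runtime = "nodejs";

export async function POST(request: Request) {
  const data = await request.json();

  try {
    const pdfBuffer = await generateCertificatePDF(data);

    return new NextResponse(pdfBuffer, {
      status: 200,
      headers: {
        "Content-Type": "application/pdf",
        "Content-Disposition": 'attachment; filename="certificate.pdf"',
      },
    });
  } catch (error) {
    console.error("PDF generation failed:", error);
    return NextResponse.json({ error: "PDF generation failed" }, { status: 500 });
  }
}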
So, in short: keep your deployment environment in mind when choosing libraries; what runs fine on your laptop won’t necessarily run inside a serverless function.