
How can I improve serverless function (lambda) cold start performance on Vercel?

Vercel Serverless Functions enable developers to write server-side logic to connect to their database or external APIs. If you are seeing slow response times, this guide will help you identify the root cause of the additional latency.

When a Serverless Function starts for the first time, it’s a cold start. Subsequent requests to that Function are then considered warm. Cold starts typically occur in under 1% of invocations. The duration of a cold start varies from under 100 ms to over 1 second.

This guide will help you improve the performance of your Functions and understand how to determine if the latency increase is from a cold start.

Improving your Function performance

The following suggestions will help you ensure optimal performance of your Vercel Serverless Functions:

  1. Choose the correct region for your functions: Vercel Serverless Functions are deployed to us-east by default. All customers can change the default region for their functions in their project settings. Choose a region that's closest to your data source for optimal performance.
  2. Choose smaller dependencies inside your functions: Cold start times correlate with function size, and that size often comes mostly from external dependencies. If you have large dependencies, parsing and evaluating the JavaScript code can take 3-5 seconds or longer. Review your bundle with a bundle analyzer and try to eliminate larger dependencies; one way to defer a heavy dependency is sketched after this list.
  3. Use proper caching headers: Serverless Function responses can be cached using Cache-Control headers. This helps ensure optimal performance for repeat visitors, and Vercel's Edge cache even supports stale-while-revalidate headers; a header example also follows this list. Note that a cache miss still has to request data from your origin (e.g., a database), which is slower than reading directly from the Edge cache.
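As a complement to trimming dependencies out of your bundle, you can defer a heavy dependency with a dynamic import() so it is only parsed on the request path that actually needs it, keeping it out of the module evaluation a cold start pays for. The sketch below assumes a Node.js Serverless Function; some-heavy-pdf-lib and its renderPdf function are hypothetical placeholders, not recommendations.

// api/report.js
export default async function handler(req, res) {
  if (req.query.format === 'pdf') {
    // The dynamic import keeps the library out of top-level module
    // evaluation, which is the work repeated on every cold start.
    const { renderPdf } = await import('some-heavy-pdf-lib'); // hypothetical package
    const pdf = await renderPdf({ title: 'Monthly report' });
    res.setHeader('Content-Type', 'application/pdf');
    return res.send(pdf);
  }

  // The common, lightweight path never loads the heavy dependency.
  res.status(200).json({ title: 'Monthly report' });
}

Deferring a hypothetical heavy dependency inside a Node.js Serverless Function.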
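For the caching suggestion, a minimal sketch of a Node.js Serverless Function that sets a Cache-Control header with stale-while-revalidate might look like this; the s-maxage and stale-while-revalidate values are illustrative only.

// api/product.js
export default async function handler(req, res) {
  // Hypothetical data; replace with your own database or API call.
  const data = { product: 'example', updatedAt: new Date().toISOString() };

  // Cache at the Edge for 60 seconds, then keep serving the stale response
  // for up to 5 more minutes while a fresh one is fetched in the background.
  res.setHeader('Cache-Control', 's-maxage=60, stale-while-revalidate=300');

  res.status(200).json(data);
}

Setting Cache-Control headers so repeat visitors are served from Vercel's Edge cache.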

Faster performance with Vercel Edge Functions

Vercel now also has support for Edge Functions, which use a lightweight Edge Runtime, enabling greater performance with near-zero cold starts.

By opting into the constraints of the Edge Runtime, you eliminate the overhead of booting the full Node.js runtime. Edge Functions run on top of the V8 JavaScript engine, so you do not have access to all Node.js APIs, such as process, path, or fs, when developing with Edge Functions.

// api/hello.js

export const config = {
  runtime: 'edge',
}

export default (req) => new Response('Hello world!')

An Edge API Route using Vercel Edge Functions.
