I thought Vercel's backend was just like another backend...
Source: Dev.to
Lesson learned about Vercel vs. Traditional Backend
On my journey to learn Next.js in depth through projects, I tried to architect my Next.js backend like a normal Go backend (dependency injection, long-lived services). It turns out that doesn't carry over. Treating Vercel's /api routes as a “mini Express server” is a fundamental architectural error. While both run Node.js, their underlying infrastructure (serverless vs. long‑running) dictates completely different rules.
Other services that work similarly are Netlify, Cloudflare Pages / Workers, and AWS Amplify.
1. Process Lifecycle: Ephemeral vs. Persistent
Traditional Backend (Long‑Running)
- Concept: The process starts and stays alive until you manually stop it or it crashes.
- Global scope: Permanent; memory is allocated once and remains available across thousands of requests.
- Implication: You can rely on the server “remembering” things in its internal RAM for days or weeks.
Vercel (Serverless)
- Concept: The process is ephemeral. It is created when a request arrives and terminated (or frozen) the millisecond the response is sent.
- Global scope: Destroyed frequently.
- Implication: Any internal state is wiped. Treat every request as if the server just performed a cold start.
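One way to see the ephemeral lifecycle in action is to record a timestamp at module scope, which only runs when an instance boots. This is a minimal sketch, assuming Node.js; the 50 ms threshold is an arbitrary illustrative cutoff, not anything Vercel provides:

```javascript
// Module scope is evaluated once per instance, so a module-level
// timestamp reveals whether this request is the first one after a
// cold start. (The 50 ms threshold is an illustrative assumption.)
const bootedAt = Date.now(); // runs once, when the instance boots

function handler() {
  const ageMs = Date.now() - bootedAt;
  // On a cold start the module was just evaluated, so ageMs is tiny.
  return { coldStart: ageMs < 50, ageMs };
}

console.log(handler().coldStart); // true on the very first request
```

On a traditional server `bootedAt` would stay constant for days; on Vercel it resets every time a new instance spins up.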
2. State Management: The Global Variable Trap
Traditional Backend (Long‑Running)
- Memory: Global variables are shared. If User A updates a global counter, User B sees the updated value because they hit the same process.
- Garbage collection: As long as the process lives, GC ignores global variables because they are reachable, allowing in‑memory caching or simple counters.
Vercel (Serverless)
- Memory: Global variables are isolated. If 10 users hit your API, Vercel may spin up 10 separate execution environments (instances).
- Trap: User A updates a counter in Instance #1. User B hits Instance #2, which has its own global scope where the counter is still 0.
- Note: A warm instance can retain globals between sequential requests, but idle instances revert to a cold state and lose all globals.
- Solution: Do not rely on in‑memory data for persistence. Use an external store such as Redis or PostgreSQL.
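The trap above can be simulated in plain Node.js. Each call to `createInstance()` below stands in for a separate serverless execution environment with its own copy of “global” state:

```javascript
// Simulating isolated serverless instances: each factory call gets
// its own closure, just as each Vercel instance gets its own
// module-level globals.
function createInstance() {
  let counter = 0; // "global" variable, scoped to this instance only
  return {
    handler() {
      counter += 1;
      return counter;
    },
  };
}

const instance1 = createInstance();
const instance2 = createInstance();

// User A's two requests land on the same warm instance: state survives.
console.log(instance1.handler()); // 1
console.log(instance1.handler()); // 2

// User B is routed to a fresh instance: the counter starts over.
console.log(instance2.handler()); // 1, not 3
```

The only fix is to move the counter out of process memory entirely, into Redis, PostgreSQL, or another shared store.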
3. Concurrency: Thread Pooling vs. Horizontal Scaling
Traditional Backend (Long‑Running)
- Mechanism: One server process handles many concurrent requests on a single thread with an event loop (in Node.js) and manages a shared pool of connections.
- Scaling: High traffic makes the server work harder; if overwhelmed, requests queue up.
Vercel (Serverless)
- Mechanism: Vercel scales by multiplication—spawning many copies of the function when traffic spikes.
- Challenge: If 1,000 functions start simultaneously, each may open a new database connection, risking connection exhaustion and crashing the DB.
- Solution: Use a connection pooler (e.g., Prisma Accelerate or Supabase pooling).
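A common mitigation on the application side is to cache the database client on `globalThis`, so sequential requests hitting the same warm instance reuse one connection instead of reconnecting. This is a sketch; `createClient()` is a hypothetical stand-in for something like `new PrismaClient()`:

```javascript
// Cache one client per warm instance on globalThis, so repeated
// invocations in the same instance reuse the connection.
let connections = 0;

function createClient() {
  connections += 1; // track how many real connections we would open
  return { query: (sql) => `result of ${sql}` }; // stand-in client
}

function getClient() {
  if (!globalThis.__dbClient) {
    globalThis.__dbClient = createClient();
  }
  return globalThis.__dbClient;
}

// Three "requests" served by the same warm instance share one client.
getClient().query('SELECT 1');
getClient().query('SELECT 2');
getClient().query('SELECT 3');
console.log(connections); // 1
```

Note this only helps within a single instance; when 1,000 instances spin up at once, you still need an external pooler such as Prisma Accelerate or Supabase's pooling layer in front of the database.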
4. Execution Timing: Synchronous vs. Background Tasks
Traditional Backend (Long‑Running)
- You can send a response and continue running code in the background (e.g., `res.send(); fireAndForgetEmail();`).
- The process stays alive, allowing the background task to finish.
Vercel (Serverless)
- Flow: The moment `res.send()` is called, the runtime environment is paused or terminated.
- Disaster: Background tasks started after the response may be cut off mid‑execution, causing incomplete or missing work (e.g., half‑sent emails).
- Solution: Finish all work before responding, or delegate long‑running work to a queue service such as Upstash Workflow.
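The difference between the two handler styles can be sketched as below. `sendEmail()` is a hypothetical slow side effect; note that a local Node process will still complete an un-awaited promise, which is exactly why this bug is easy to miss in development but bites on Vercel:

```javascript
// "Finish all work before responding" in practice.
const sent = [];

function sendEmail(to) {
  return new Promise((resolve) =>
    setTimeout(() => { sent.push(to); resolve(); }, 10));
}

// Risky on serverless: the response is returned before the email is
// sent, and Vercel may freeze the instance at that point.
async function fireAndForgetHandler() {
  sendEmail('a@example.com'); // not awaited: may be cut off
  return { status: 200 };
}

// Safe: every side effect completes before the response goes out.
async function awaitedHandler() {
  await sendEmail('b@example.com');
  return { status: 200 };
}

awaitedHandler().then(() => {
  console.log(sent.includes('b@example.com')); // true
});
```

For work too slow to finish inside the request, hand it off to a queue or workflow service (e.g., Upstash Workflow) and respond immediately with an acknowledgment.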
5. Persistent Connections: WebSockets vs. HTTP
Traditional Backend (Long‑Running)
- Connectivity: Supports stateful connections. A client can keep a TCP socket open (WebSockets) for real‑time chat or live updates; the server process remains available to hold the other end.
Vercel (Serverless)
- Connectivity: Supports only stateless HTTP requests. The function dies after the response, so it cannot keep a connection open.
- Result: Using `socket.io` or similar will fail because the “server” disappears after each request.
- Solution: Use a third‑party real‑time‑as‑a‑service solution such as Pusher or Ably.
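The resulting architecture can be sketched with an in-memory stand-in for the broker (in production this would be Pusher, Ably, or similar; the `broker` object below is purely illustrative). The browser holds the persistent connection to the broker, while the API route only makes a short-lived publish call and exits:

```javascript
// In-memory stand-in for an external real-time broker.
const broker = {
  subscribers: new Map(),
  subscribe(channel, fn) {
    const list = this.subscribers.get(channel) ?? [];
    list.push(fn);
    this.subscribers.set(channel, list);
  },
  publish(channel, message) {
    for (const fn of this.subscribers.get(channel) ?? []) fn(message);
  },
};

// The client subscribes through the broker, not through our server.
const received = [];
broker.subscribe('chat', (msg) => received.push(msg));

// The serverless API route: publish, respond, die. No socket is held.
function chatHandler(body) {
  broker.publish('chat', body.text);
  return { status: 200 };
}

chatHandler({ text: 'hello' });
console.log(received); // ['hello']
```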
This article was created with AI assistance as a personal note.