Handling 10,000+ database queries during build: A Node.js connection pooling story

Published: January 13, 2026 at 12:56 PM EST
5 min read
Source: Dev.to

Overview

If your stack looks like this:

  • PostgreSQL database
  • Prisma ORM
  • Node.js static site generation

…you have probably met this error at build time:

FATAL: too many connections for role "role_xxxxx"

Prisma Accelerate has built‑in connection pooling to prevent such errors.
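
This post's solution takes a different route, but for reference, a minimal Accelerate setup looks roughly like this; the prisma:// URL and API key are placeholders, and the import assumes Prisma's @prisma/extension-accelerate package:

// .env – Accelerate's pooled connection string replaces the direct Postgres URL
// DATABASE_URL="prisma://accelerate.prisma.io/?api_key=__YOUR_API_KEY__"

// lib/db.ts
import { PrismaClient } from '@prisma/client';
import { withAccelerate } from '@prisma/extension-accelerate';

// Every query now goes through Accelerate's managed connection pool
export const prisma = new PrismaClient().$extends(withAccelerate());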

My build process was generating 70+ pages, each requiring multiple database queries. The math was simple and brutal:

70 pages × 4 queries per page × concurrent execution = connection‑pool explosion.

The Problem: Node.js Event Loop vs. Database Connections

Most tutorials don’t tell you this: Node.js is too good at concurrency. When you generate static pages, Node.js doesn’t wait politely for one page to finish before starting the next—it fires off all page generations simultaneously. Each page needs to:

  1. Query products from the database
  2. Query blog posts for related content
  3. Query category data
  4. Generate dynamic sitemap entries

That’s potentially 280+ concurrent database connections hitting a pool limited to 5‑10 connections.
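
To make that concrete, here is roughly what the build amounts to; the page count and query shapes are simplified, and the category model name is a stand-in, but the fan-out pattern is the point:

// build-simulation.ts – an illustration of the fan-out, not code from the project
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

async function buildPage(slug: string) {
  // Each page fires its own queries without waiting for any other page
  const [products, posts, categories] = await Promise.all([
    prisma.product.findMany({ where: { slug } }),
    prisma.blogPost.findMany({ where: { status: 'published' } }),
    prisma.category.findMany(), // "category" is a stand-in model name
  ]);
  return { slug, products, posts, categories };
}

async function main() {
  const slugs = Array.from({ length: 70 }, (_, i) => `page-${i}`);
  // Static generation effectively kicks everything off at once:
  await Promise.all(slugs.map(buildPage)); // ~280 queries racing for 5-10 connections
}

main().catch(console.error);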

Result: Build failures, timeout errors, and a lot of frustration.

Why Standard Solutions Failed

Attempt 1: Increase Connection Limit

DATABASE_URL="postgresql://user:pass@host:5432/db?connection_limit=20"

Result: Still failed. The problem wasn’t the limit—it was the rate of connection creation.

Attempt 2: Manual Connection Management

await prisma.$connect();
const data = await prisma.product.findMany();
await prisma.$disconnect();

Result: Made it worse. I was creating and destroying connections rapidly, overwhelming the DB server.

Attempt 3: Prisma’s Built‑in Pooling

const prisma = new PrismaClient();

Result: Not enough. The singleton pattern works great for runtime requests, but it only caps how many clients exist; it does nothing to limit how many queries run at once, so the build's concurrent page generation still overwhelmed the pool.

The Solution: Query Queuing in Node.js

I needed to serialize database operations during the build while keeping them concurrent at runtime. The pattern that worked:

// lib/db.ts
import { PrismaClient } from '@prisma/client';

const globalForPrisma = globalThis as unknown as {
  prisma: PrismaClient | undefined;
  queryQueue: Promise<any> | undefined;
};

const createPrismaClient = () => {
  return new PrismaClient({
    log:
      process.env.NODE_ENV === 'development' ? ['error', 'warn'] : ['error'],
  });
};

export const prisma = globalForPrisma.prisma ?? createPrismaClient();

if (process.env.NODE_ENV !== 'production') {
  globalForPrisma.prisma = prisma;
}

/**
 * Serialize queries during the production build.
 * In dev/runtime the function runs immediately.
 */
const queueQuery = async (fn: () => Promise<any>): Promise<any> => {
  if (
    process.env.NEXT_PHASE === 'phase-production-build' ||
    process.env.NODE_ENV === 'production'
  ) {
    const previousQuery = globalForPrisma.queryQueue ?? Promise.resolve();
    const currentQuery = previousQuery
      .catch(() => {}) // swallow the previous query's failure so the chain keeps moving
      .then(() => fn()); // run this query only after the previous one has settled
    globalForPrisma.queryQueue = currentQuery;
    return currentQuery;
  }
  return fn();
};

export async function withPrisma(
  callback: (prisma: PrismaClient) => Promise<any>
): Promise<any> {
  return queueQuery(async () => {
    try {
      return await callback(prisma);
    } catch (error) {
      console.error('Database operation failed:', error);
      throw error;
    }
  });
}

Key Insight

Use a promise chain as a queue. Each query waits for the previous one to finish during the build, but executes immediately at runtime.

How It Works

  • Build (production): Query 1 → Query 2 → Query 3 → … (serial)
  • Runtime (server): Query 1, Query 2, Query 3, … all execute concurrently

globalForPrisma.queryQueue acts as a checkpoint. Each new query:

  1. Waits for the previous promise to resolve.
  2. Executes its DB operation.
  3. Becomes the new checkpoint for the next query.
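
In practice, data-access helpers route through withPrisma instead of touching prisma directly. getAllAmazonProducts is the helper the sitemap below imports; its body here is a sketch of what such a call site might look like, with the query options as assumptions:

// lib/products.ts – illustrative call site for withPrisma
import { withPrisma } from './db';

export async function getAllAmazonProducts({ limit }: { limit: number }) {
  // During the build this call joins the promise-chain queue;
  // at runtime it runs immediately.
  return withPrisma((prisma) =>
    prisma.product.findMany({
      take: limit,
      orderBy: { updatedAt: 'desc' },
    })
  );
}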

Implementation Across the Codebase

Sitemap Generation

// app/sitemap.ts
export default async function sitemap() {
  const baseUrl = process.env.BASE_URL ?? 'https://example.com';
  const staticPages = [
    { url: `${baseUrl}/`, priority: 1 },
    // …other static routes
  ];

  // ---- Products -------------------------------------------------
  let productPages: { url: string; lastModified: Date; priority: number }[] =
    [];
  try {
    const products = await getAllAmazonProducts({ limit: 1000 });
    productPages = products.map((p) => ({
      url: `${baseUrl}/products/${p.slug}`,
      lastModified: p.updatedAt,
      priority: p.featured ? 0.85 : 0.75,
    }));
  } catch (error) {
    console.error('Error fetching products for sitemap:', error);
  }

  // ---- Blog posts -----------------------------------------------
  let blogPages: { url: string; lastModified: Date; priority: number }[] = [];
  try {
    const posts = await prisma.blogPost.findMany({
      where: { status: 'published' },
    });
    blogPages = posts.map((post) => ({
      url: `${baseUrl}/blog/${post.slug}`,
      lastModified: post.updatedAt,
      priority: 0.65,
    }));
  } catch (error) {
    console.error('Error fetching blog posts for sitemap:', error);
  }

  return [...staticPages, ...productPages, ...blogPages];
}

Each try‑catch block handles failures gracefully—if a DB call fails, that section is simply skipped instead of crashing the whole build.

Production Results

Metric                      Before               After
Build success rate          ~30%                 100%
Average build time          ~2 min (unstable)    45 s (consistent)
Connection-timeout errors   Frequent             Zero
Pages generated             70+ (often failed)   70+ (all succeed)

You can see this architecture handling production traffic at elyvora.us – 70+ database‑driven pages, all built without a single connection error.

Lessons Learned

  1. Environment‑specific behavior matters – Don’t assume build‑time behavior matches runtime behavior. Node.js behaves differently in each context.
  2. Global state isn’t always evil – Using globalThis to maintain a promise queue across module boundaries solves the “multiple‑instance” problem during static generation.
  3. Serializing DB work during a build can save you – A simple promise‑chain queue prevents connection‑pool exhaustion without sacrificing runtime concurrency.

Graceful Degradation Over Perfection

My sitemap has fallback logic. If product queries fail, it still generates static pages. Partial success > complete failure.

Log Everything During Build

const allPages = [...staticPages, ...productPages, ...blogPages];

console.log(`✅ Sitemap generated with ${allPages.length} URLs`);
console.log(`   - Product pages: ${productPages.length}`);
console.log(`   - Blog pages: ${blogPages.length}`);

These logs saved hours of debugging. You can't easily attach a debugger to a build process, so stdout is your best friend.

When to Use This Pattern

This approach works best when you have:

  • Static site generation with database‑backed content
  • High page counts (50+ pages)
  • Limited database connection pools (shared hosting, free tiers)
  • Multiple queries per page (complex data relationships)

For runtime‑only applications (pure API servers), stick with standard connection pooling.
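
In that case, tuning the pool through Prisma's connection-string parameters is usually all you need; the numbers below are illustrative, not recommendations:

# Cap the pool size and let requests wait briefly for a free connection
DATABASE_URL="postgresql://user:pass@host:5432/db?connection_limit=10&pool_timeout=20"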

Conclusion

Node.js’s async nature is usually a superpower. But during static generation with database dependencies, it becomes a challenge. The solution isn’t to fight Node.js’s concurrency—it’s to control when concurrency happens. By serializing queries during the build and keeping them concurrent at runtime, you get the best of both worlds: reliable builds and fast server responses.

Update: This pattern has been running in production for 3+ weeks at elyvora.us with zero connection issues. 70+ pages rebuild every hour without breaking a sweat.

Happy building, and may your connection pools stay happy!
