Implementing the Bulkhead Pattern in Node.js

Published: December 16, 2025 at 06:41 AM EST
3 min read
Source: Dev.to

Introduction to System Resilience

In a distributed system or a microservices architecture, a single failing component can cause a ripple effect that brings down an entire application. This is often seen when a database or downstream service becomes slow. Incoming requests continue to stack up, consuming memory and open connections until the Node.js process becomes unresponsive.

The Bulkhead pattern is a structural design used to isolate critical resources. By limiting the number of concurrent operations allowed to reach a specific resource (like a database), we ensure that a spike in traffic or a slowdown in the database does not consume all available server resources.

A semaphore is the synchronization primitive used to implement a Bulkhead. Unlike a mutex, which allows only one task to proceed at a time, a semaphore allows a defined number of concurrent tasks (N).

In a Node.js context, we use a semaphore to manage:

  • Counter: Tracks how many asynchronous operations are currently in progress.
  • Queue: Stores the “resolve” functions of promises for tasks that arrived after the concurrency limit was reached.
  • Wait (P) / Signal (V): Operations that respectively request a slot or release a slot.

The class requires a concurrencyLimit to define the maximum simultaneous operations and a queueLimit to prevent memory exhaustion from an infinite waiting line.
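
Before looking at the full Bulkhead class, here is a minimal sketch of the semaphore primitive on its own. The Semaphore name and the acquire/release methods are illustrative choices, not part of the article's implementation, which inlines the same logic inside a single run method.

class Semaphore {
  constructor(max) {
    this.max = max;       // N: how many tasks may hold a slot at once
    this.count = 0;       // Counter: slots currently in use
    this.waiters = [];    // Queue: resolve callbacks of suspended tasks
  }

  // Wait (P): take a slot, or suspend until one is signalled free.
  async acquire() {
    if (this.count < this.max) {
      this.count++;
      return;
    }
    await new Promise((resolve) => this.waiters.push(resolve));
    this.count++;
  }

  // Signal (V): free a slot and wake the oldest waiting task, if any.
  release() {
    this.count--;
    const next = this.waiters.shift();
    if (next) next();
  }
}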

Bulkhead Implementation

class Bulkhead {
  constructor(concurrencyLimit, queueLimit = 100) {
    this.concurrencyLimit = concurrencyLimit;
    this.queueLimit = queueLimit;
    this.activeCount = 0;
    this.queue = [];
  }

  async run(task) {
    // Admission Control and Wait Logic
    if (this.activeCount >= this.concurrencyLimit) {
      if (this.queue.length >= this.queueLimit) {
        throw new Error("Bulkhead capacity exceeded: Server Busy");
      }

      await new Promise((resolve) => {
        this.queue.push(resolve);
      });
    }

    this.activeCount++;

    try {
      // Execution of the asynchronous task
      return await task();
    } finally {
      // Release Logic: free the slot and wake exactly one queued task.
      // The awakened task resumes past its await and increments activeCount itself.
      this.activeCount--;
      if (this.queue.length > 0) {
        const nextInLine = this.queue.shift();
        nextInLine();
      }
    }
  }
}

Using the Bulkhead with Mongoose

// Allow at most 5 concurrent queries; queue up to 10 more before rejecting.
const dbBulkhead = new Bulkhead(5, 10);

// Assumes an existing Express app and a Mongoose User model.
app.get('/data', async (req, res) => {
  try {
    const result = await dbBulkhead.run(() => User.find().lean());
    res.json(result);
  } catch (error) {
    res.status(503).json({ message: error.message });
  }
});

The database call is wrapped in the run method. This ensures that even if 1,000 requests hit the API endpoint simultaneously, only a controlled number actually hit the database.
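
To see that throttling in isolation, a small simulation along the following lines can be used. The slowQuery helper and the numbers are illustrative, not from the article: with a limit of 5 and a queue of 10, only 15 of the 1,000 calls are admitted and the rest are rejected immediately, while concurrency never exceeds 5.

const bulkhead = new Bulkhead(5, 10);

let inFlight = 0;
let peak = 0;

// Simulated slow database call (~50 ms) that tracks its own concurrency.
const slowQuery = async () => {
  inFlight++;
  peak = Math.max(peak, inFlight);
  await new Promise((resolve) => setTimeout(resolve, 50));
  inFlight--;
  return 'ok';
};

// Fire 1,000 calls at once; map each outcome to a label so nothing rejects unhandled.
const calls = Array.from({ length: 1000 }, () =>
  bulkhead.run(slowQuery).then(() => 'fulfilled', () => 'rejected')
);

Promise.all(calls).then((results) => {
  const rejected = results.filter((r) => r === 'rejected').length;
  console.log(`peak concurrency: ${peak}, rejected: ${rejected}`);
});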

Key Technical Takeaways

  • Fail‑Fast (Admission Control): By checking queue.length, we reject requests immediately (HTTP 503) rather than letting them hang and consume RAM.
  • Error Isolation: The finally block guarantees that activeCount is decremented even if a database query fails, allowing the next queued task to proceed.
  • Resource Management: In microservices, the concurrency limit should be calculated as Total DB Connections / Number of Service Instances (a sizing sketch follows this list).
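
As a rough illustration of that last point, the numbers below are assumptions (a shared pool of 100 connections split across 4 instances), and the queue size of twice the limit is likewise an illustrative choice, not a rule from the article.

const TOTAL_DB_CONNECTIONS = 100;  // connections the database can sustain in total
const SERVICE_INSTANCES = 4;       // replicas of this service sharing that pool

// Each instance gets an equal share of the connection budget.
const concurrencyLimit = Math.floor(TOTAL_DB_CONNECTIONS / SERVICE_INSTANCES); // 25

// The queue size is a separate tuning knob; here it is simply twice the limit.
const dbBulkhead = new Bulkhead(concurrencyLimit, concurrencyLimit * 2);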

💡 Have questions? Drop them in the comments!
