Async Doesn’t Mean Infinite Concurrency in Node.js
Source: Dev.to
The misconception about async concurrency
Node.js makes concurrency easy with async/await and Promises, but a dangerous misconception often arises:
"If it's async, you can run unlimited operations safely."
This is not true. Unbounded concurrency can silently degrade your system.
Why unbounded concurrency is problematic
Consider this common pattern:
await Promise.all(events.map(publishEvent));
If events.length is 10,000, this creates 10,000 concurrent operations. Such a burst can overwhelm:
- Database connection pools
- RabbitMQ / Kafka connections
- Downstream services
- Memory and event‑loop scheduling
async removes thread blocking, not resource limits. Each async operation still consumes resources:
- Network sockets
- File descriptors
- Memory
- Connection‑pool slots
- CPU time for callbacks
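You can see the burst directly with a small instrumented sketch. Here `publishEvent` is a stub that just waits 10 ms while counters record how many calls are in flight at once; the names `inFlight` and `peak` are illustrative, not part of any API.

```javascript
// Stub standing in for a real broker/database call: it resolves after 10 ms
// and tracks how many calls are running simultaneously.
let inFlight = 0;
let peak = 0;

function publishEvent(event) {
  inFlight++;
  peak = Math.max(peak, inFlight);
  return new Promise(resolve =>
    setTimeout(() => {
      inFlight--;
      resolve(event);
    }, 10)
  );
}

async function main() {
  const events = Array.from({ length: 1000 }, (_, i) => i);
  // The unbounded pattern: every call starts before any finishes,
  // because .map() invokes publishEvent synchronously for each item.
  await Promise.all(events.map(publishEvent));
  console.log(`peak in-flight operations: ${peak}`); // prints 1000
}

main();
```

All 1,000 operations are in flight at the same moment; with a real socket or connection pool behind `publishEvent`, that spike is what exhausts it.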
Consequences of too many concurrent promises
- Connection‑pool exhaustion
- Increased latency
- Timeouts
- Cascading failures
Ironically, trying to go faster can make the system slower.
Bounded concurrency as a solution
Process work in controlled parallel batches instead of launching everything at once.
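The simplest form of this is plain batching, sketched below with a hypothetical `processInBatches` helper (no dependencies): split the input into chunks and await each chunk before starting the next.

```javascript
// Batching sketch: at most `batchSize` workers run in parallel,
// and each batch must fully complete before the next one starts.
async function processInBatches(items, batchSize, worker) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Parallelism only within the batch
    results.push(...(await Promise.all(batch.map(worker))));
  }
  return results;
}

// Example: process 5 items, at most 2 at a time
processInBatches([1, 2, 3, 4, 5], 2, async x => x * 2)
  .then(out => console.log(out)); // logs [ 2, 4, 6, 8, 10 ]
```

The trade-off: each batch finishes only as fast as its slowest item, so slots sit idle at the end of every batch. A sliding-window limiter, like the p-limit example below, avoids that barrier.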
Example: limit concurrency to 10 with p-limit
import pLimit from "p-limit";

// Allow at most 10 publishEvent calls in flight at any moment
const limit = pLimit(10);

await Promise.all(
  events.map(event => limit(() => publishEvent(event)))
);
This approach guarantees:
- Maximum 10 concurrent operations
- Stable resource usage
- Predictable throughput
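If you prefer zero dependencies, the same sliding-window behavior can be sketched by hand. The helper name `mapWithLimit` is hypothetical; the pattern is a fixed pool of workers pulling items from a shared index, so a new operation starts the moment any slot frees up.

```javascript
// Dependency-free bounded concurrency: at most `limit` calls to `worker`
// run at once, with no batch barrier between items.
async function mapWithLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;

  // Each runner repeatedly claims the next unprocessed index.
  // JS is single-threaded, so `next++` between awaits is race-free.
  async function run() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  }

  const poolSize = Math.min(limit, items.length);
  await Promise.all(Array.from({ length: poolSize }, run));
  return results;
}
```

Results come back in input order because each runner writes to its claimed index, even though completion order varies.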
Typical scenarios where bounded concurrency shines
- Outbox event publishers
- Message consumers
- Bulk database operations
- API integrations
- Background workers
Conclusion
Concurrency should be intentional, not accidental. In production systems, bounded concurrency is often faster and more reliable than unbounded concurrency, because stability scales and chaos doesn't.