Node.js vs FastAPI Event Loop: A Deep Dive into Async Concurrency
Source: Dev.to
API Stacks in Production
Two stacks repeatedly appear in modern production systems:
- Node.js
- FastAPI
Both are widely known for handling high‑concurrency workloads and rely heavily on event‑driven architectures.
Quick Comparison
| Feature | Node.js | FastAPI |
|---|---|---|
| Concurrency model | Single‑threaded event loop + non‑blocking I/O | Async coroutines + multiple worker processes |
| Typical runtime | JavaScript (V8) | Python (asyncio) |
| Scales across cores | Requires cluster mode | Natural via multiple processes |
| Best for | I/O‑heavy APIs, real‑time apps, WebSockets | APIs, ML inference back‑ends, streaming, microservices |
| Potential bottleneck | CPU‑heavy work blocks the event loop | GIL limits threads within a single worker; mitigated by separate worker processes |
Node.js
Core Philosophy
- Executes JavaScript on a single thread (mirroring the browser model).
- I/O operations are delegated to libuv, a high‑performance C library.
- When I/O completes, callbacks are pushed back into the event loop.
This design works extremely well for I/O‑heavy workloads, which most web servers are.
The trade‑off: CPU‑heavy work blocks the JavaScript thread, stalling the whole server.
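That trade‑off can be seen in a minimal sketch: a synchronous busy loop delays a timer callback that was already scheduled, because both compete for the single JavaScript thread.

```javascript
// A busy loop on the single JS thread delays a timer scheduled for 10 ms.
const start = Date.now();

setTimeout(() => {
  // Scheduled for 10 ms, but it cannot run until the busy loop yields the thread.
  console.log(`timer fired after ${Date.now() - start} ms`);
}, 10);

// Synchronous CPU work: blocks the event loop for ~200 ms.
while (Date.now() - start < 200) {}
```

The timer logs well after 200 ms, not at 10 ms — no callback can run while synchronous code holds the thread.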
Important Characteristics
- All JavaScript code runs on one thread.
- I/O → delegated to libuv (network, filesystem, timers, thread‑pool scheduling).
- Event loop phases:
| Phase | What Happens |
|---|---|
| Timers | Executes callbacks scheduled via setTimeout() / setInterval(). |
| Pending Callbacks | Processes system‑level callbacks (e.g., TCP errors). |
| Poll | Heart of the loop – waits for I/O events, retrieves completed operations, runs callbacks. |
| Check | Executes setImmediate() callbacks. |
| Close Callbacks | Handles cleanup events (e.g., socket.destroy()). |
Thread Pool (libuv)
- Handles file‑system access, DNS lookups, compression, cryptography, etc.
- Default size: 4 threads (configurable via UV_THREADPOOL_SIZE).
- JavaScript execution remains single‑threaded – heavy CPU tasks still block the loop.
Simple Example (Node.js)
// Callback style (old)
db.query(sql, (err, result) => {
  if (err) throw err;
  console.log(result);
});

// Promise / async‑await (modern)
const result = await db.query(sql);
console.log(result);
Even with async/await, Node still schedules operations via callback queues; async/await is syntactic sugar over promises and callbacks.
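The equivalence is easy to demonstrate: an `async` function returns a promise, and `await` corresponds to a `.then()` chain. Here `query()` is a made‑up stand‑in for a real async driver call.

```javascript
// await is sugar over promises: these two forms are interchangeable.
function query() {
  return Promise.resolve(42); // stand‑in for a real async driver call
}

async function withAwait() {
  const result = await query();
  return result + 1;
}

// Equivalent promise‑chain form:
function withThen() {
  return query().then((result) => result + 1);
}
```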
FastAPI
Core Philosophy
- FastAPI is just a framework; the runtime comes from ASGI servers such as:
  - uvicorn
  - hypercorn
  - gunicorn with uvicorn workers
- Each worker is:
- A separate OS process
- Its own Python interpreter
- Its own event loop
Concurrency therefore comes from two layers:
- Async coroutines inside each worker.
- Multiple worker processes across CPU cores.
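The first layer — coroutine concurrency inside one worker — can be sketched with plain asyncio, no FastAPI required; `fetch` here is a stand‑in for a real non‑blocking I/O call:

```python
import asyncio
import time

async def fetch(i):
    await asyncio.sleep(0.1)   # stand‑in for non‑blocking I/O
    return i

async def main():
    start = time.perf_counter()
    # Ten 0.1 s waits overlap on one event loop instead of running back to back.
    results = await asyncio.gather(*(fetch(i) for i in range(10)))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())  # elapsed ≈ 0.1 s, not 1 s
```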
Typical Endpoint
from fastapi import FastAPI

app = FastAPI()

@app.get("/users")
async def get_users():
    result = await db.fetch_all()
    return result
- Python creates a coroutine object (a pausable computation).
- The event loop schedules and resumes coroutines as needed.
- This yields a linear, readable programming model compared to callback hell.
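The "pausable computation" point is observable directly: calling an `async def` function does not run its body, it only builds a coroutine object that the event loop later drives to completion.

```python
import asyncio

async def get_users():
    await asyncio.sleep(0)        # yield control back to the loop once
    return ["alice", "bob"]

coro = get_users()                # nothing has run yet: just a coroutine object
kind = type(coro).__name__        # 'coroutine'
users = asyncio.run(coro)         # the event loop drives it to completion
```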
Callback vs Coroutine Model
| Aspect | Node.js (Callbacks) | FastAPI (Coroutines) |
|---|---|---|
| Syntax | db.query(sql, (err, result) => {...}) | result = await db.query() |
| Flow | Nested callbacks → “callback hell” (mitigated by promises/async‑await) | Pausable functions → natural flow |
| Scheduling | Callback queues (event loop) | Cooperative scheduling of coroutines |
Process‑Based Parallelism
- FastAPI deployments typically run multiple worker processes:
- Each worker → 1 OS process → 1 Python interpreter → 1 event loop
- This enables true parallelism across CPU cores, sidestepping Python’s Global Interpreter Lock (GIL) because each process has its own interpreter.
Worker 1 → CPU core 1
Worker 2 → CPU core 2
...
Concurrency Strategies Summarized
Node.js Concurrency Model
- Single‑threaded event loop + non‑blocking I/O.
- Excels at:
- APIs
- Streaming services
- Real‑time applications (WebSockets, chat servers, live dashboards)
- Strengths:
- High I/O throughput
- Real‑time systems
- Massive ecosystem & mature tooling
- Weaknesses:
- CPU‑heavy tasks block the loop (e.g., large JSON parsing, image processing, encryption, ML inference)
- Requires worker threads or separate microservices for heavy CPU work.
FastAPI Concurrency Model
- Async coroutines + multiple worker processes.
- Enables responsive handling even when a worker is blocked by CPU work (other workers keep serving).
- Strengths:
- Seamless integration with machine‑learning frameworks
- Strong type hints & automatic validation via Pydantic
- Excellent developer ergonomics
- Strong performance for both I/O‑bound and CPU‑bound workloads (via process scaling)
- Common Misunderstanding: FastAPI does not automatically run everything asynchronously; you must explicitly use async def and await your I/O‑bound calls.
Takeaways
- Node.js shines when the workload is I/O‑bound and you can keep JavaScript execution lightweight.
- FastAPI provides a more flexible model for mixed workloads, leveraging Python’s rich ecosystem and process‑level parallelism.
- Choose the stack that aligns with your performance profile, team expertise, and operational constraints.
Clarifying Asynchronous Endpoints in FastAPI
A common claim: declaring an endpoint with plain def will block FastAPI's event loop.
The Reality
def endpoint():
...
- This is not asynchronous.
- FastAPI will run it inside a thread‑pool executor.
Result:
Blocking function → Executed in thread pool → Event loop remains free
To use the event loop directly, the endpoint must be declared as:
async def endpoint():
...
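What FastAPI does for plain `def` endpoints can be sketched with asyncio directly; `blocking_call` is a made‑up stand‑in for any synchronous function. The blocking work runs on a thread‑pool executor while the event loop keeps ticking.

```python
import asyncio
import time

def blocking_call():
    time.sleep(0.2)            # blocks whichever thread runs it
    return "done"

async def main():
    loop = asyncio.get_running_loop()
    # What FastAPI does for plain `def` endpoints: hand the blocking
    # function to a thread-pool executor so the event loop stays free.
    future = loop.run_in_executor(None, blocking_call)
    ticks = 0
    while not future.done():   # the loop keeps doing other work meanwhile
        await asyncio.sleep(0.01)
        ticks += 1
    return await future, ticks

result, ticks = asyncio.run(main())
```

The loop accumulates `ticks` while the blocking call sleeps — evidence that it was never stalled.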
Understanding the Full Stack Layers
Node.js
V8 Engine → Node.js Runtime → libuv → Operating System
- V8 executes JavaScript.
- libuv handles async I/O.
FastAPI
Python Interpreter → FastAPI Framework → ASGI → Uvicorn Server → asyncio / uvloop → Operating System
- Many FastAPI deployments use uvloop, an event‑loop implementation written in C.
- uvloop itself is built on top of libuv — the same library used by Node.js.
Resulting equivalence
FastAPI + uvloop = Python + libuv
Different language, same underlying asynchronous engine.
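Opting into uvloop explicitly looks like the sketch below (it assumes `pip install uvloop`); in practice uvicorn selects uvloop automatically when it is installed, so this is rarely written by hand.

```python
# Explicit uvloop opt-in (assumes `pip install uvloop`); falls back to the
# stdlib asyncio loop when uvloop is not available.
import asyncio

try:
    import uvloop
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
    pass  # stdlib asyncio loop is used instead

loop_policy = type(asyncio.get_event_loop_policy()).__name__
```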
Concurrency Philosophy
Both Node.js and FastAPI follow the same fundamental philosophy:
- Event‑driven, non‑blocking I/O
But they scale concurrency in different ways.
| Feature | Node.js | FastAPI |
|---|---|---|
| Execution model | Single JavaScript thread | asyncio coroutines |
| I/O handling | libuv thread pool | asyncio / uvloop |
| Parallelism | Cluster mode or worker threads | Multiple worker processes |
| Multi‑core scalability | Requires explicit clustering | Natural multi‑core support |
Practical Guidance
- Use Node.js when building real‑time systems and staying within the JavaScript ecosystem.
- Use FastAPI when building services around Python, data science, machine learning, or AI workloads.
Key takeaway: Understanding how each concurrency model behaves under load is crucial, because early architectural decisions shape how systems scale later.
Your Thoughts
Which runtime do you prefer for high‑concurrency APIs, and why?