Node.js vs FastAPI Event Loop: A Deep Dive into Async Concurrency

Published: March 4, 2026 at 04:34 PM EST
6 min read
Source: Dev.to

API Stacks in Production

Two stacks repeatedly appear in modern production systems:

  • Node.js
  • FastAPI

Both are widely known for handling high‑concurrency workloads and rely heavily on event‑driven architectures.

Quick Comparison

| Feature | Node.js | FastAPI |
| --- | --- | --- |
| Concurrency model | Single‑threaded event loop + non‑blocking I/O | Async coroutines + multiple worker processes |
| Typical runtime | JavaScript (V8) | Python (asyncio) |
| Scales across cores | Requires cluster mode | Natural via multiple processes |
| Best for | I/O‑heavy APIs, real‑time apps, WebSockets | APIs, ML inference back‑ends, streaming, microservices |
| Potential bottleneck | CPU‑heavy work blocks the event loop | GIL limits threads within a process (mitigated by separate worker processes) |

Node.js

Core Philosophy

  • Executes JavaScript on a single thread (mirroring the browser model).
  • I/O operations are delegated to libuv, a high‑performance C library.
  • When I/O completes, callbacks are pushed back into the event loop.

This design works extremely well for I/O‑heavy workloads, which most web servers are.
The trade‑off: CPU‑heavy work blocks the JavaScript thread, stalling the whole server.

Important Characteristics

  • All JavaScript code runs on one thread.
  • I/O → delegated to libuv (network, filesystem, timers, thread‑pool scheduling).
  • Event loop phases:
| Phase | What Happens |
| --- | --- |
| Timers | Executes callbacks scheduled via `setTimeout()` / `setInterval()`. |
| Pending Callbacks | Processes system‑level callbacks (e.g., TCP errors). |
| Poll | Heart of the loop: waits for I/O events, retrieves completed operations, runs callbacks. |
| Check | Executes `setImmediate()` callbacks. |
| Close Callbacks | Handles cleanup events (e.g., `socket.destroy()`). |

Thread Pool (libuv)

  • Handles file‑system access, DNS lookups, compression, cryptography, etc.
  • Default size: 4 threads (configurable via UV_THREADPOOL_SIZE).
  • JavaScript execution remains single‑threaded – heavy CPU tasks still block the loop.

Simple Example (Node.js)

// Callback style (old)
db.query(sql, (err, result) => {
  if (err) throw err;
  console.log(result);
});

// Promise / async‑await (modern)
const result = await db.query(sql);
console.log(result);

Even with async/await, Node still schedules operations via callback queues; async/await is syntactic sugar over promises and callbacks.

FastAPI

Core Philosophy

  • FastAPI is just a framework; the runtime comes from ASGI servers such as:
    • uvicorn
    • hypercorn
    • gunicorn + uvicorn workers
  • Each worker is:
    • A separate OS process
    • Its own Python interpreter
    • Its own event loop

Concurrency therefore comes from two layers:

  1. Async coroutines inside each worker.
  2. Multiple worker processes across CPU cores.
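These two layers can be sketched with the standard library alone. This is a minimal illustration, not FastAPI internals; `handle_request` and `worker` are made‑up names:

```python
import asyncio
import multiprocessing

async def handle_request(i):
    # Layer 1: coroutines interleave inside one worker's event loop.
    await asyncio.sleep(0)  # stand-in for awaited I/O
    return i * 2

def worker(requests):
    # Each worker process runs its own interpreter and its own event loop.
    async def serve():
        return await asyncio.gather(*(handle_request(i) for i in requests))
    return asyncio.run(serve())

if __name__ == "__main__":
    # Layer 2: multiple OS processes spread across CPU cores.
    with multiprocessing.Pool(processes=2) as pool:
        results = pool.map(worker, [[1, 2], [3, 4]])
    print(results)  # [[2, 4], [6, 8]]
```

In a real deployment the process layer is handled by the ASGI server (e.g., gunicorn with uvicorn workers) rather than hand‑rolled multiprocessing.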

Typical Endpoint

from fastapi import FastAPI

app = FastAPI()

@app.get("/users")
async def get_users():
    result = await db.fetch_all()
    return result

  • Python creates a coroutine object (a pausable computation).
  • The event loop schedules and resumes coroutines as needed.
  • This yields a linear, readable programming model compared to callback hell.
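The coroutine‑object behaviour can be observed with plain `asyncio`, no FastAPI required; `fetch_users` below is a made‑up stand‑in for a database call:

```python
import asyncio

async def fetch_users():
    await asyncio.sleep(0)      # stand-in for awaited I/O
    return ["alice", "bob"]

coro = fetch_users()            # calling an async def does NOT run the body...
print(type(coro).__name__)      # coroutine -- a pausable computation
result = asyncio.run(coro)      # ...the event loop schedules and resumes it
print(result)                   # ['alice', 'bob']
```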

Callback vs Coroutine Model

| Aspect | Node.js (Callbacks) | FastAPI (Coroutines) |
| --- | --- | --- |
| Syntax | `db.query(sql, (err, result) => {...})` | `result = await db.query()` |
| Flow | Nested callbacks → “callback hell” (mitigated by promises/async‑await) | Pausable functions → natural flow |
| Scheduling | Callback queues (event loop) | Cooperative scheduling of coroutines |

Process‑Based Parallelism

  • FastAPI deployments typically run multiple worker processes:
    • Each worker → 1 OS process → 1 Python interpreter → 1 event loop
  • This enables true parallelism across CPU cores, sidestepping Python’s Global Interpreter Lock (GIL) because each process has its own interpreter.

Worker 1 → CPU core 1
Worker 2 → CPU core 2
...
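A minimal sketch of this GIL‑free parallelism using `multiprocessing` from the standard library (`cpu_bound` is an illustrative function, not part of FastAPI):

```python
import multiprocessing

def cpu_bound(n):
    # Pure-Python CPU work: within ONE process the GIL would serialize threads,
    # but each pool worker is a separate interpreter with its own GIL.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        results = pool.map(cpu_bound, [10, 100, 1000])
    print(results)  # [285, 328350, 332833500]
```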

Concurrency Strategies Summarized

Node.js Concurrency Model

  • Single‑threaded event loop + non‑blocking I/O.
  • Excels at:
    • APIs
    • Streaming services
    • Real‑time applications (WebSockets, chat servers, live dashboards)
  • Strengths:
    • High I/O throughput
    • Real‑time systems
    • Massive ecosystem & mature tooling
  • Weaknesses:
    • CPU‑heavy tasks block the loop (e.g., large JSON parsing, image processing, encryption, ML inference)
    • Requires worker threads or separate microservices for heavy CPU work.

FastAPI Concurrency Model

  • Async coroutines + multiple worker processes.
  • Enables responsive handling even when a worker is blocked by CPU work (other workers keep serving).
  • Strengths:
    • Seamless integration with machine‑learning frameworks
    • Strong type hints & automatic validation via Pydantic
    • Excellent developer ergonomics
    • Strong performance for both I/O‑bound and CPU‑bound workloads (via process scaling)
  • Common Misunderstanding: FastAPI does not automatically run everything asynchronously; you must explicitly use async def and await I/O‑bound calls.
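The distinction is easy to demonstrate with plain `asyncio`: only an awaited call yields the event loop so other coroutines can run. All names below are illustrative:

```python
import asyncio

log = []

async def slow_io():
    log.append("slow: start")
    await asyncio.sleep(0.01)   # await hands control back to the event loop
    log.append("slow: done")

async def quick():
    log.append("quick: ran while slow_io was awaiting")

async def main():
    await asyncio.gather(slow_io(), quick())

asyncio.run(main())
print(log)
# ['slow: start', 'quick: ran while slow_io was awaiting', 'slow: done']
```

Had `slow_io` called a blocking function (e.g., `time.sleep`) instead of awaiting, `quick` could not have run until it finished.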

Takeaways

  • Node.js shines when the workload is I/O‑bound and you can keep JavaScript execution lightweight.
  • FastAPI provides a more flexible model for mixed workloads, leveraging Python’s rich ecosystem and process‑level parallelism.
  • Choose the stack that aligns with your performance profile, team expertise, and operational constraints.

Clarifying Asynchronous Endpoints in FastAPI

Claim: “a plain `def` endpoint will block the event loop.” That is not true.

The Reality

def endpoint():
    ...

  • This is not asynchronous.
  • FastAPI will run it inside a thread‑pool executor.

Result:

Blocking function → Executed in thread pool → Event loop remains free

To use the event loop directly, the endpoint must be declared as:

async def endpoint():
    ...
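FastAPI offloads sync endpoints to a thread pool (via Starlette/AnyIO); the same pattern can be mimicked with the standard library's `asyncio.to_thread`. A sketch, with `blocking_io` and `heartbeat` as made‑up names:

```python
import asyncio
import time

def blocking_io():
    # A plain sync function, like a `def` endpoint body.
    time.sleep(0.01)                 # blocks its pool thread, not the event loop
    return "done"

async def heartbeat(log):
    # Keeps running on the event loop while blocking_io sits in the thread pool.
    for _ in range(3):
        log.append("tick")
        await asyncio.sleep(0.003)

async def main():
    log = []
    result, _ = await asyncio.gather(
        asyncio.to_thread(blocking_io),  # offload blocking call to the pool
        heartbeat(log),
    )
    return result, log

result, log = asyncio.run(main())
print(result, log)  # done ['tick', 'tick', 'tick']
```

The heartbeat ticks keep firing while the blocking call runs, showing the event loop stayed free.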

Understanding the Full Stack Layers

Node.js

V8 Engine → Node.js Runtime → libuv → Operating System

  • V8 executes JavaScript.
  • libuv handles async I/O.

FastAPI

Python Interpreter → FastAPI Framework → ASGI → Uvicorn Server → asyncio / uvloop → Operating System

  • Many FastAPI deployments use uvloop, an event‑loop implementation written in C.
  • uvloop itself is built on top of libuv — the same library used by Node.js.

Resulting equivalence

FastAPI + uvloop  =  Python + libuv

Different language, same underlying asynchronous engine.

Concurrency Philosophy

Both Node.js and FastAPI follow the same fundamental philosophy:

  • Event‑driven, non‑blocking I/O

But they scale concurrency in different ways.

| Feature | Node.js | FastAPI |
| --- | --- | --- |
| Execution model | Single JavaScript thread | asyncio coroutines |
| I/O handling | libuv (non‑blocking sockets + thread pool) | asyncio / uvloop |
| Parallelism | Cluster mode or worker threads | Multiple worker processes |
| Multi‑core scalability | Requires explicit clustering | Natural multi‑core support |

Practical Guidance

  • Use Node.js when building real‑time systems and staying within the JavaScript ecosystem.
  • Use FastAPI when building services around Python, data science, machine learning, or AI workloads.

Key takeaway: Understanding how each concurrency model behaves under load is crucial, because early architectural decisions shape how systems scale later.

Your Thoughts

Which runtime do you prefer for high‑concurrency APIs, and why?

Further Reading

If you enjoyed this article, you can read more of my blogs here:

https://writings.dipchakraborty.com
