Event-Driven Architecture 101: Building a Simple App with Kafka - By Gopi Gugan
Source: Dev.to
What Is Kafka (In Plain English)?
Apache Kafka is a distributed event streaming platform used to:
- Publish events (producers)
- Store events durably (topics)
- Consume events (consumers)
Instead of services calling each other directly, they emit events to Kafka. Other services react to those events asynchronously, when they are ready.
Think of Kafka as a highly reliable, scalable event log.
When Should You Use Kafka?
Kafka is a strong choice when you need:
- Asynchronous communication between services
- High‑throughput data pipelines
- Real‑time processing
- Decoupled microservices
You probably do not need Kafka if:
- You only have one service
- You rely on simple request/response APIs
- Your scale is small and predictable
Kafka is powerful — but unnecessary complexity is still complexity.
High‑Level Architecture
At a high level, Kafka works like this:
- A producer sends an event to a topic.
- Kafka stores the event durably.
- One or more consumers read the event at their own pace.
This loosens the coupling between services and helps keep a failure in one service from cascading into the others.
Step 1: Run Kafka Locally with Docker
The fastest way to get started is Docker.
```yaml
# docker-compose.yml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```
Start Kafka:
```bash
docker compose up -d
```
You now have a working Kafka broker running locally.
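The `orders` topic used in the next steps can be created explicitly with the `kafka-topics` CLI that ships inside the cp-kafka image. Depending on the broker's auto-create setting this step may be optional; the service name `kafka` comes from the compose file above:
```bash
docker compose exec kafka kafka-topics --create \
  --topic orders \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```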
Step 2: Create a Producer (Node.js)
A producer’s only responsibility is to emit events.
```js
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "order-service",
  brokers: ["localhost:9092"],
});

const producer = kafka.producer();

async function sendEvent() {
  await producer.connect();

  // Emit an "order placed" event; Kafka stores it durably in the topic
  await producer.send({
    topic: "orders",
    messages: [
      {
        value: JSON.stringify({
          orderId: 123,
          total: 49.99,
        }),
      },
    ],
  });

  await producer.disconnect();
}

sendEvent().catch(console.error);
```
Key takeaway: The producer does not know who consumes the event.
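One detail worth knowing before we move on: once a topic has more than one partition, ordering is only guaranteed within a partition. Giving related events the same key routes them to the same partition, so they stay in order. A quick sketch reusing the connected producer from above (the `status` payloads are illustrative):
```js
// Both events share the key "123", so they land on the same
// partition and consumers see "created" before "paid".
await producer.send({
  topic: "orders",
  messages: [
    { key: "123", value: JSON.stringify({ orderId: 123, status: "created" }) },
    { key: "123", value: JSON.stringify({ orderId: 123, status: "paid" }) },
  ],
});
```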
Step 3: Create a Consumer
Consumers react to events independently.
```js
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "billing-service",
  brokers: ["localhost:9092"],
});

const consumer = kafka.consumer({ groupId: "billing-service" });

async function run() {
  await consumer.connect();

  // fromBeginning lets a brand-new group read events produced earlier
  await consumer.subscribe({ topic: "orders", fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());
      console.log("Processing order:", event.orderId);
    },
  });
}

run().catch(console.error);
```
You can now add additional services (shipping, analytics, notifications) without changing the producer.
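That works because each consumer group keeps its own offsets: billing and shipping both receive every event, independently and at their own pace. A minimal sketch of a second service, assuming a hypothetical `shipping-service` group:
```js
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "shipping-service",
  brokers: ["localhost:9092"],
});

// A different groupId means an independent cursor into the same log
const consumer = kafka.consumer({ groupId: "shipping-service" });

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());
      console.log("Scheduling shipment for order:", event.orderId);
    },
  });
}

run().catch(console.error);
```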
Why Event‑Driven Architecture Scales
| Traditional Architecture | Event‑Driven Architecture |
|---|---|
| Tight coupling | Loose coupling |
| Synchronous calls | Asynchronous events |
| Hard to scale | Horizontally scalable |
| Fragile failures | Resilient systems |
Kafka acts like a shock absorber between services.
Common Kafka Mistakes to Avoid
- Treating Kafka like a queue (it is a log; see the sketch after this list)
- Creating too many tiny topics
- Ignoring schema evolution
- Using Kafka when a database and cron job would be simpler
Kafka should reduce complexity — not add to it.
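The first mistake deserves a closer look: in a queue, a consumed message is gone; in Kafka, events stay in the log until retention expires, so consumers can rewind and reprocess them. A sketch of replaying the `orders` topic from offset 0 using kafkajs `seek` (assumes a single partition; the group ID is illustrative):
```js
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "replay-demo",
  brokers: ["localhost:9092"],
});

const consumer = kafka.consumer({ groupId: "replay-demo" });

async function replay() {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders" });

  // seek() requires run() to have started, so run() is not awaited here
  consumer.run({
    eachMessage: async ({ message }) => {
      console.log("Replaying:", message.value.toString());
    },
  });

  // Rewind partition 0 to the start of the log; a queue could not do this
  consumer.seek({ topic: "orders", partition: 0, offset: "0" });
}

replay().catch(console.error);
```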
When Kafka Becomes a Superpower
Kafka really shines when combined with:
- Schema Registry (Avro or Protobuf)
- Stream processing (Kafka Streams or Flink)
- Real‑time analytics pipelines
- Event‑driven notifications
At that point, Kafka becomes your system’s central nervous system.
Kafka is not scary — it is just a durable event log with rules. If you understand:
- Topics
- Producers
- Consumers
- Consumer groups
You already understand most of Kafka.