Building a Kafka Event-Driven Spring Boot Application with Avro, Schema Registry and PostgreSQL
Source: Dev.to

Introduction
If you’re building event‑driven systems with Apache Kafka, you must think about data contracts early. This post shows a practical, end‑to‑end Spring Boot example using:
- Apache Kafka
- Confluent Schema Registry
- Avro serialization
- PostgreSQL
- Docker Compose
Full source code is provided in the repository linked at the end of the article.
Why Schema Registry + Avro?
JSON works… until it doesn’t. Common problems in Kafka‑based systems:
- Breaking consumers when producers change payloads
- No schema versioning
- Unclear data contracts between teams
Avro + Schema Registry solves this by:
- Enforcing schema compatibility
- Allowing safe schema evolution
- Decoupling producers from consumers
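To make the contract concrete, here is a sketch of what the Avro schema (.avsc) for the user event could look like, based on the payload used later in this post; the namespace and exact field definitions in the repository may differ:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.users",
  "fields": [
    { "name": "id",        "type": "string"  },
    { "name": "email",     "type": "string"  },
    { "name": "firstName", "type": "string"  },
    { "name": "lastName",  "type": "string"  },
    { "name": "isActive",  "type": "boolean" },
    { "name": "age",       "type": "int"     }
  ]
}
```

On the first produced message, the Avro serializer registers this schema with the Schema Registry, or validates it against the version already stored for the topic's subject.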
Architecture Overview
Client (Postman) → Spring Boot Producer (REST) → Kafka Topic (users.v1) → Spring Boot Consumer → PostgreSQL
- The producer exposes POST /users
- The payload is converted to an Avro record
- The message is published to Kafka
- The consumer deserializes the Avro record and persists the data to PostgreSQL
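To wire this up, the producer needs to know about both Kafka and the Schema Registry and must serialize values with Avro. A minimal configuration sketch (the Schema Registry URL assumes the Confluent default port and is not taken from the repository):

```properties
# Producer serialization settings (sketch; adjust hosts and ports to your setup)
spring.kafka.bootstrap-servers=localhost:29092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.kafka.properties.schema.registry.url=http://localhost:8081
```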
What This Demo Includes
- Spring Boot Kafka Producer (Avro)
- Spring Boot Kafka Consumer (Avro)
- Confluent Schema Registry
- PostgreSQL persistence using Spring Data JPA
- Schema evolution with backward compatibility
- Docker Compose for local development
Local Setup (Kafka + Schema Registry + PostgreSQL)
Prerequisites
- Java 21
- Maven
- Docker & Docker Compose
Start infrastructure
docker compose up -d
Services started:
- Kafka → localhost:29092
- Schema Registry → (address omitted in original)
- PostgreSQL → localhost:5432
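For reference, here is a trimmed-down sketch of what the Compose file for this stack typically contains; image versions, ports and credentials are assumptions, so treat the repository's docker-compose.yml as the source of truth:

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.6.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.6.0
    depends_on: [zookeeper]
    ports:
      - "29092:29092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Internal listener for other containers, host listener for the Spring apps
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry:7.6.0
    depends_on: [kafka]
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092

  postgres:
    image: postgres:16
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: users
```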
Run the Applications
Consumer
cd consumer-app
mvn spring-boot:run
The consumer listens to users.v1 and persists messages to PostgreSQL.
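A minimal sketch of that listener, assuming an Avro-generated User class, a UserEntity JPA entity and a Spring Data UserRepository (all names here are illustrative, not taken from the repository):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class UserEventListener {

    private final UserRepository repository; // Spring Data JPA repository (illustrative)

    public UserEventListener(UserRepository repository) {
        this.repository = repository;
    }

    // 'User' is the Avro-generated class; the consumer must be configured with
    // KafkaAvroDeserializer and specific.avro.reader=true to receive it directly
    @KafkaListener(topics = "users.v1", groupId = "users-consumer")
    public void onUserEvent(User event) {
        // Map the Avro record onto a JPA entity and persist it
        UserEntity entity = new UserEntity();
        entity.setId(event.getId().toString());
        entity.setEmail(event.getEmail().toString());
        entity.setFirstName(event.getFirstName().toString());
        entity.setLastName(event.getLastName().toString());
        entity.setActive(event.getIsActive());
        entity.setAge(event.getAge());
        repository.save(entity);
    }
}
```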
Producer
cd producer-app
mvn spring-boot:run
The producer exposes a REST endpoint.
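The endpoint can be sketched roughly like this, assuming the JSON payload is bound to a small DTO and then mapped onto the Avro-generated User class (controller and DTO names are illustrative):

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/users")
public class UserController {

    // Incoming JSON payload
    public record UserRequest(String id, String email, String firstName,
                              String lastName, boolean isActive, int age) {}

    private final KafkaTemplate<String, User> kafkaTemplate;

    public UserController(KafkaTemplate<String, User> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<Void> createUser(@RequestBody UserRequest request) {
        // Build the Avro record from the JSON payload ('User' is the Avro-generated class)
        User event = User.newBuilder()
                .setId(request.id())
                .setEmail(request.email())
                .setFirstName(request.firstName())
                .setLastName(request.lastName())
                .setIsActive(request.isActive())
                .setAge(request.age())
                .build();

        // Key by user id so all events for the same user land on the same partition
        kafkaTemplate.send("users.v1", request.id(), event);
        return ResponseEntity.accepted().build();
    }
}
```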
Produce an Event
curl -X POST http://localhost:8080/users \
-H "Content-Type: application/json" \
-d '{
"id": "u-1",
"email": "user@test.com",
"firstName": "John",
"lastName": "Doe",
"isActive": true,
"age": 30
}'
You’ll see:
- Avro schema registered (or validated)
- Message published to Kafka
- Consumer saving the record to PostgreSQL
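If you want to confirm the write, you can query PostgreSQL directly; the service, database and table names below are assumptions, so adjust them to the repository's configuration:

```bash
docker compose exec postgres psql -U postgres -d users -c 'SELECT * FROM users;'
```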
Schema Evolution (The Important Part)
Avro allows schemas to evolve safely, as long as the compatibility rules are respected.
Typical steps:
- Add a new optional field.
- Provide a default value.
- Keep compatibility set to BACKWARD.
Schema Registry ensures:
- Old consumers keep working.
- New producers don’t break the system.
This demo is designed to show real‑world schema evolution, not toy examples.
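As an illustration, a backward-compatible change could add an optional field with a default to the schema sketched earlier (the phoneNumber field is invented for this example). Because a consumer using the new schema falls back to the default when older records don't contain the field, existing data and consumers keep working:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.users",
  "fields": [
    { "name": "id",          "type": "string"  },
    { "name": "email",       "type": "string"  },
    { "name": "firstName",   "type": "string"  },
    { "name": "lastName",    "type": "string"  },
    { "name": "isActive",    "type": "boolean" },
    { "name": "age",         "type": "int"     },
    { "name": "phoneNumber", "type": ["null", "string"], "default": null }
  ]
}
```

Adding a required field without a default, or changing a field's type incompatibly, would be rejected by the Schema Registry under BACKWARD compatibility.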
Confluent Cloud Ready
The project also supports Confluent Cloud via Spring profiles:
- SASL/SSL
- Schema Registry API keys
- use.latest.version=true
- auto.register.schemas=false
Perfect for CI/CD pipelines.
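A sketch of what such a profile might contain; everything in angle brackets is a placeholder, and the exact property set in the repository may differ:

```properties
spring.kafka.bootstrap-servers=<CONFLUENT_BOOTSTRAP_SERVER>
spring.kafka.properties.security.protocol=SASL_SSL
spring.kafka.properties.sasl.mechanism=PLAIN
spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
spring.kafka.properties.schema.registry.url=<SCHEMA_REGISTRY_URL>
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
# Let the CI/CD pipeline register schemas; the app only uses what is already registered
spring.kafka.properties.auto.register.schemas=false
spring.kafka.properties.use.latest.version=true
```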
Source Code
GitHub repository includes:
- Docker Compose configuration
- Avro schemas
- Producer & consumer applications
- PostgreSQL setup
- Postman collection
Who Is This For?
- Java & Spring Boot developers
- Kafka users moving beyond JSON
- Teams building event‑driven microservices
- Anyone learning Schema Registry + Avro
Final Thoughts
This is a production‑style Kafka example, not a simple hello‑world. If you’re serious about:
- Schema contracts
- Backward compatibility
- Safe evolution
- Real persistence
then this demo will save you a lot of trial and error.
Feel free to star the repository if it helped you, or fork it and adapt it to your own system.