Building a New Poker Variant with an AWS Serverless Architecture
Source: Dev.to
How I designed and built a real‑time multiplayer game platform using AWS serverless infrastructure
Designing Before Coding
Before writing any code I produced a 36‑page Low‑Level Design (LLD) document. The LLD forced me to consider every edge case, state transition, and failure mode up front—exactly how production systems are built.
By the numbers
- 7 AWS services
- 8 DynamoDB tables
- 36‑page LLD
- 0 servers to manage
The Architecture
Point Game is a fully serverless, event‑driven system. Clients use REST APIs for account and table operations and WebSockets for real‑time gameplay.
Infrastructure Overview
| Component | Technology |
|---|---|
| CDN & Static Assets | CloudFront + S3 |
| API Layer | API Gateway (REST + WebSocket) |
| Compute | AWS Lambda |
| Database | DynamoDB |
| Authentication | Cognito |
| Scheduling | EventBridge |
The key insight: DynamoDB is the single source of truth. Every game state, action log, and connection mapping lives in DynamoDB. Lambda functions are stateless—they read state, process actions, write state, and broadcast. This makes the system horizontally scalable and resilient to failures.
Client Action → API Gateway → Lambda → DynamoDB → Broadcast
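To make that flow concrete, here is a minimal sketch of a stateless action handler in Python with boto3. The table name and helper functions (`apply_action`, `broadcast_state`) are placeholders of my own, not the project's actual code.

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
game_table = dynamodb.Table("GameState")  # assumed table name

def apply_action(state, action):
    """Pure game-rule logic (bet, fold, etc.); elided here."""
    ...

def broadcast_state(state):
    """Push the new state to connected clients (see Challenge 3)."""
    ...

def handler(event, context):
    body = json.loads(event["body"])

    # 1. Read the authoritative state from DynamoDB.
    state = game_table.get_item(Key={"gameId": body["gameId"]})["Item"]

    # 2. Process the action in memory; the function holds no state of its own.
    new_state = apply_action(state, body["action"])

    # 3. Write the updated state back, then 4. broadcast it.
    game_table.put_item(Item=new_state)
    broadcast_state(new_state)
    return {"statusCode": 200}
```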
Data Model
| Table | Purpose |
|---|---|
| Game State | Current hand state, seats, pots, board |
| Action Log | Append‑only record of every action |
| Hand Snapshots | End‑of‑hand state for replay/audit |
| Connection Store | WebSocket ID → Player mapping |
| Turn Timers | Scheduled timeout tracking |
| Inter‑Round Queue | Pending join/leave/config actions |
| Users | Account data and balances |
| Ledger | Buy‑in/cash‑out history |
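As one illustration of the key design, here is how the append-only Action Log table could be defined with boto3. The table and attribute names are assumptions; the article does not list the exact schemas.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Append-only action log: partition by game, sort by action sequence,
# so a hand's history can be read back in order with a single Query.
dynamodb.create_table(
    TableName="ActionLog",  # assumed name
    AttributeDefinitions=[
        {"AttributeName": "gameId", "AttributeType": "S"},
        {"AttributeName": "sequence", "AttributeType": "N"},
    ],
    KeySchema=[
        {"AttributeName": "gameId", "KeyType": "HASH"},
        {"AttributeName": "sequence", "KeyType": "RANGE"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
```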
The Hard Problems
Challenge 1: Optimistic Concurrency Control
Simultaneous player actions can corrupt state. I implemented sequence‑based versioning: each state mutation includes an expected sequence number. If the number does not match, the write fails and the client resynchronizes, eliminating race conditions and lost actions.
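A minimal sketch of that check using a DynamoDB conditional write (table and attribute names are assumed):

```python
import boto3
from botocore.exceptions import ClientError

game_table = boto3.resource("dynamodb").Table("GameState")  # assumed name

def commit_action(game_id, expected_seq, new_state):
    """Persist the mutation only if no one else has advanced the sequence."""
    try:
        game_table.put_item(
            Item={**new_state, "gameId": game_id, "sequence": expected_seq + 1},
            ConditionExpression="#seq = :expected",
            ExpressionAttributeNames={"#seq": "sequence"},
            ExpressionAttributeValues={":expected": expected_seq},
        )
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return False  # stale state: ask the client to resynchronize
        raise
```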
Challenge 2: Turn Timer System
Lambda functions cannot “wait.” I used EventBridge scheduled events: when a turn starts, a future event with a timer sequence is scheduled. The timeout Lambda checks whether the sequence is still current; if the player has already acted, the timer is ignored, otherwise the player is auto‑folded.
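One way to express this is with a one-off EventBridge Scheduler schedule; the exact scheduling mechanism is my assumption, and the ARNs and names below are placeholders.

```python
import json
from datetime import datetime, timedelta, timezone
import boto3

scheduler = boto3.client("scheduler")  # EventBridge Scheduler

def schedule_turn_timeout(game_id, timer_seq, seconds=30):
    """Create a one-shot schedule that invokes the timeout Lambda later.

    The timeout handler compares timer_seq with the current value in the
    Turn Timers table and does nothing if the player already acted.
    """
    fire_at = datetime.now(timezone.utc) + timedelta(seconds=seconds)
    scheduler.create_schedule(
        Name=f"turn-timeout-{game_id}-{timer_seq}",
        ScheduleExpression=f"at({fire_at.strftime('%Y-%m-%dT%H:%M:%S')})",
        FlexibleTimeWindow={"Mode": "OFF"},
        Target={
            "Arn": "arn:aws:lambda:...:function:TurnTimeout",    # placeholder ARN
            "RoleArn": "arn:aws:iam::...:role/SchedulerInvoke",  # placeholder role
            "Input": json.dumps({"gameId": game_id, "timerSeq": timer_seq}),
        },
        ActionAfterCompletion="DELETE",  # clean up one-shot schedules
    )
```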
Challenge 3: Privacy‑Filtered Broadcasting
Each player must see a personalized view (own hole cards vs. opponents’ card backs). The broadcaster loads the authoritative state, filters out private information per recipient, and sends a tailored WebSocket message.
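A sketch of that per-recipient filtering, assuming a seat list with a `holeCards` field and using API Gateway's management API to push messages (endpoint URL and field names are placeholders):

```python
import json
import boto3

# The callback URL comes from the WebSocket API's stage; placeholder below.
apigw = boto3.client(
    "apigatewaymanagementapi",
    endpoint_url="https://<api-id>.execute-api.<region>.amazonaws.com/prod",
)

def broadcast_state(state, connections):
    """Send each player a view with opponents' hole cards stripped out.

    connections: (connection_id, player_id) pairs from the Connection Store.
    """
    for connection_id, player_id in connections:
        view = {
            **state,
            "seats": [
                seat if seat.get("playerId") == player_id
                else {**seat, "holeCards": None}  # opponents see card backs only
                for seat in state["seats"]
            ],
        }
        apigw.post_to_connection(
            ConnectionId=connection_id,
            Data=json.dumps(view).encode("utf-8"),
        )
```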
Challenge 4: Inter‑Round Action Queue
Players may join, leave, or change settings mid‑hand. These actions are stored in a queue and processed atomically between hands, preserving game state consistency while accommodating real‑world player behavior.
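A simplified sketch of draining that queue at the end of a hand; the names are assumed, and a production version would wrap the writes in a transaction to keep the step atomic.

```python
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
queue_table = dynamodb.Table("InterRoundQueue")  # assumed name

def apply_lobby_action(state, item):
    """Join / leave / config change applied to the table state; elided."""
    ...

def drain_inter_round_queue(table_id, state):
    """Apply queued actions in arrival order once the current hand ends."""
    pending = queue_table.query(
        KeyConditionExpression=Key("tableId").eq(table_id)
    )["Items"]
    for item in sorted(pending, key=lambda i: i["queuedAt"]):
        state = apply_lobby_action(state, item)
        queue_table.delete_item(
            Key={"tableId": table_id, "queuedAt": item["queuedAt"]}
        )
    return state
```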
Challenge 5: Complex Game Rules
Point Game includes many edge cases—especially showdown logic involving side pots, split pots, and multiple winners. Translating these rules into reliable code required careful state management under concurrent load, all without dedicated servers.
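As one example of the showdown edge cases, here is a common layered algorithm for splitting contributions into a main pot and side pots. This is my own illustration, not the project's code.

```python
def build_pots(contributions, folded):
    """Split chip contributions into a main pot and side pots.

    contributions: player_id -> total chips committed this hand.
    folded: player_ids who folded (their chips stay in the pots,
    but they are not eligible to win them).
    Returns a list of (amount, eligible_player_ids), main pot first.
    """
    pots = []
    remaining = dict(contributions)
    while any(amount > 0 for amount in remaining.values()):
        # The smallest non-zero contribution caps this pot layer.
        layer = min(a for a in remaining.values() if a > 0)
        contributors = [p for p, a in remaining.items() if a > 0]
        eligible = [p for p in contributors if p not in folded]
        pots.append((layer * len(contributors), eligible))
        for p in contributors:
            remaining[p] -= layer
    return pots

# Example: three players committed different amounts, one folded early.
print(build_pots({"A": 100, "B": 60, "C": 20}, folded={"C"}))
# -> [(60, ['A', 'B']), (80, ['A', 'B']), (40, ['A'])]
```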
Why This Exceeds the Industry Standard
Most hobby projects are simple CRUD apps or static pages. Point Game demonstrates:
- A real‑time distributed system with WebSocket state synchronization
- Event‑driven architecture with scheduled triggers and async processing
- Production‑grade consistency guarantees via optimistic concurrency
- Domain‑specific game logic handling complex state machines for multiple simultaneous players and games
I designed, documented, and built the entire platform and can explain every architectural decision.