I Built a Mini Derivatives Exchange in Python. Here's How I Used Cursor Without Letting It Run the Show.

Published: March 1, 2026 at 03:05 PM EST
4 min read
Source: Dev.to

Motivation

This year I wanted to level up in AI‑assisted coding as a senior engineer while also exploring trading. I built a paper derivatives exchange that includes an order book, limit and market orders, a matching engine, and position tracking. The stack is Python, FastAPI, and PostgreSQL. I used Cursor throughout the project, but not in the “just type a prompt and get a full app” way.

Planning

One‑pager Architecture

I started by opening a document (not the IDE) and writing a single architecture-and-projects file that covered:

  • Repository split – frontend and backend live in separate repositories, one deployable each.
  • Repository responsibilities – exchange API, demo UI, and a placeholder for future gamification integration.
  • Tech choices – FastAPI, PostgreSQL, Binance public API for prices (no Kafka yet).
  • Module boundaries – core, market_data, orders, matching, positions, integration, each with a clear job and a public interface.

That document became the plan that guided Cursor. When Cursor suggested putting the UI in the same repo as the API, I pointed back to the doc and said no. This is essentially Cursor’s Plan Mode: get the design and file‑level plan right before letting the agent write code.

Architecture

I chose a modular monolith: one service, one database, but strict module boundaries. Each module exposes a public interface (e.g., OrderService.place, OrderBook.add). No module imports another module’s internals.
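To make the boundary idea concrete, here is a minimal sketch of what a module's public surface might look like. The post only names OrderService.place and OrderBook.add; the field names and internal dict are illustrative assumptions, not the author's actual code.

```python
from dataclasses import dataclass

@dataclass
class Order:
    order_id: int
    side: str        # "buy" or "sell"
    price: float
    quantity: float

class OrderService:
    """Public entry point of the orders module.

    Other modules call only this class; the _orders dict below is
    internal state that no other module is allowed to import or touch.
    """
    def __init__(self):
        self._orders = {}  # private: order_id -> Order

    def place(self, order: Order) -> Order:
        # In the real system this would persist the order and hand it
        # to the matching engine; here we just record it.
        self._orders[order.order_id] = order
        return order

svc = OrderService()
placed = svc.place(Order(1, "buy", 100.0, 2.0))
```

The payoff of this shape is exactly the second point above: a prompt like "implement OrderService.place" has an unambiguous home and an unambiguous contract.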

Why a modular monolith?

  1. Keeps operations simple.
  2. Allows me to give Cursor very focused tasks, such as “Implement OrderService.place in the orders module and wire it to the matching engine,” instead of “build the whole trading flow.”

Implementation Details

Matching Engine

The matching engine is the heart of the system. I wanted price‑time priority, support for limit and market orders, and the ability to update positions and optionally ping a gamification API on fill.

I broke the implementation into small, reviewable steps:

  1. Data structures – Define Order, Fill, and an OrderBook (bids and asks by price level). I described the desired behavior and had Cursor implement methods such as add, insert, cancel, best_bid, and best_ask.
  2. Matching logic – When an order is placed, load the book from the DB (all open limit orders), run book.add(order), and receive a list of Fills. Cursor generated the code; I reviewed how fills were applied to orders and positions.
  3. Persistence – Orders and positions are stored in PostgreSQL. The book is rebuilt from the DB on every place and cancel operation, avoiding long‑lived in‑memory state. This trade‑off favors simplicity and correctness over ultra‑low latency for a prototype.
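A price-time priority book of the kind described in step 1 can be sketched roughly as below. This is my illustrative reconstruction, not the author's implementation: limit orders only, with bids and asks kept in heaps keyed by price and an insertion sequence number so that equal prices fill in time order.

```python
import heapq
import itertools
from dataclasses import dataclass

@dataclass
class Order:
    order_id: int
    side: str       # "buy" or "sell"
    price: float
    qty: float

@dataclass
class Fill:
    taker_id: int
    maker_id: int
    price: float
    qty: float

class OrderBook:
    def __init__(self):
        self._seq = itertools.count()   # time priority tiebreaker
        self._bids = []                 # max-heap via negated price
        self._asks = []                 # min-heap on price

    def add(self, order: Order) -> list[Fill]:
        """Cross the incoming order against the opposite side, best
        price first, earliest arrival first; rest any remainder."""
        fills = []
        opposite = self._asks if order.side == "buy" else self._bids
        while opposite and order.qty > 0:
            _, _, maker = opposite[0]
            crosses = (maker.price <= order.price if order.side == "buy"
                       else maker.price >= order.price)
            if not crosses:
                break
            traded = min(order.qty, maker.qty)
            # Fills execute at the maker's price (price-time priority).
            fills.append(Fill(order.order_id, maker.order_id, maker.price, traded))
            order.qty -= traded
            maker.qty -= traded
            if maker.qty == 0:
                heapq.heappop(opposite)
        if order.qty > 0:  # remainder rests in the book as a maker order
            rest = self._bids if order.side == "buy" else self._asks
            key = -order.price if order.side == "buy" else order.price
            heapq.heappush(rest, (key, next(self._seq), order))
        return fills

    def best_bid(self):
        return -self._bids[0][0] if self._bids else None

    def best_ask(self):
        return self._asks[0][0] if self._asks else None
```

A market order would fit the same loop with the `crosses` check relaxed; cancel would remove a resting entry by order_id. Note also the maker-quantity update inside the loop, which is precisely the spot the post mentions going wrong on the first attempt.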

Each step was a focused prompt with clear success criteria. When something went wrong (e.g., the maker order not updating on fill), I reverted, tightened the plan, and reran the agent rather than endlessly patching in chat.

Keeping Prompts Manageable

I never pasted the entire codebase into a prompt. Instead, I kept the architecture doc and a README as the “rules.” I pointed Cursor at existing files, e.g., “Add the integration call where we apply fills, same pattern as in positions/service.py.” Cursor used grep and semantic search to locate where orders are saved and where positions are updated, keeping prompts short and the agent within the boundaries I set.

Result

  • Order book with limit and market orders.
  • Position tracking.
  • Real‑time market data from Binance (no API key required).
  • Optional POST to a gamification API on fill.
  • README containing an architecture diagram, system‑design notes, and an API table.
  • Docker Compose configuration for local development.
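For the market-data piece, Binance's public ticker endpoint needs no API key. The sketch below shows one plausible way to wrap it; the injectable `fetch` parameter is my addition so the parsing can be exercised without a network call.

```python
import json
from urllib.request import urlopen

# Public, keyless endpoint; returns e.g. {"symbol": "BTCUSDT", "price": "65000.10"}
BINANCE_TICKER_URL = "https://api.binance.com/api/v3/ticker/price?symbol={symbol}"

def last_price(symbol: str, fetch=None) -> float:
    """Return the latest trade price for `symbol` from Binance.

    `fetch` maps a URL to a response body string; it defaults to a real
    HTTP GET but can be swapped out for offline testing.
    """
    if fetch is None:
        def fetch(url):
            with urlopen(url, timeout=5) as resp:
                return resp.read().decode()
    payload = json.loads(fetch(BINANCE_TICKER_URL.format(symbol=symbol)))
    return float(payload["price"])
```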

Production‑Ready Checklist

Before considering the project production‑ready, I would add:

  • Rate limiting (e.g., Redis).
  • Comprehensive unit tests for the matching engine (OrderBook.add, insert, cancel).
  • Integration tests for order placement, cancellation, and book reconstruction.
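On the rate-limiting point: before wiring in Redis, the core logic is just a token bucket. A hypothetical in-process version might look like this (in production the counters would live in Redis so all API workers share them); the injectable clock is an assumption made for testability.

```python
import time

class TokenBucket:
    """Allow at most `capacity` burst requests, refilled at a steady rate."""
    def __init__(self, capacity: int, refill_per_sec: float, now=time.monotonic):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill = refill_per_sec
        self.now = now
        self.last = now()

    def allow(self) -> bool:
        # Top up tokens in proportion to elapsed time, capped at capacity.
        t = self.now()
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.refill)
        self.last = t
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```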

Takeaways

Using Cursor like a senior developer meant:

  1. Writing the architecture and module map first.
  2. Breaking implementation into small, reviewable steps.
  3. Giving the agent clear boundaries and concrete examples instead of a “build everything” request.
  4. Reverting and refining the plan whenever the output drifted from the spec.

The result is a codebase I can explain in an interview and a story about how I leveraged AI without letting it own the design.

  • Backend: mini-derivatives-exchange – contains the API, matching engine, and database layer.
  • Frontend: mini-exchange-ui – a demo UI that consumes the API.

Both repositories include READMEs with architecture diagrams, API documentation, and system‑design notes.
