I Built an Open-Source AI Tool That Turns Any Codebase Into Deep Engineering Documentation (Runs 100% Locally)

Published: February 24, 2026 at 03:31 AM EST
3 min read
Source: Dev.to

Introducing KT Studio

KT Studio is an open‑source, local‑first web application that scans your repository and generates deep, structured engineering documentation using a local Ollama model.

  • No cloud services
  • No code uploads
  • No external APIs

Everything runs entirely on your machine.

GitHub:


What It Actually Does

KT Studio analyzes your project and generates a structured documentation site with:

  • ✅ Architecture overview (with Mermaid diagrams)
  • ✅ Quick‑start guide (real setup commands from your repo)
  • ✅ API reference (parsed from routes)
  • ✅ Database layer explanation
  • ✅ Environment variable breakdown
  • ✅ CI/CD & deployment notes
  • ✅ AI integrations detection (OpenAI, Ollama, LangChain, etc.)
  • ✅ Testing strategy
  • ✅ Troubleshooting guide
  • ✅ Ownership & risk areas

It reads your project structure and produces repo‑aware documentation, not a generic summary.


Why Local‑First AI?

Many teams can’t upload proprietary repositories to cloud AI tools. KT Studio:

  • Uses Node.js filesystem scanning
  • Skips node_modules, dist, and other build artifacts
  • Ignores .env, .pem, Terraform secrets
  • Redacts potential credentials automatically
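In TypeScript, the skip-and-redact steps above might look roughly like this. This is an illustrative sketch, not KT Studio's actual code: the function names, skip lists, and regex patterns are assumptions a real scanner would expand on.

```typescript
// Hypothetical skip rules: directories and filenames the scanner should never read.
const SKIP_DIRS = new Set(["node_modules", "dist", "build", ".git", ".next"]);
const SKIP_FILES: RegExp[] = [/\.env(\..+)?$/, /\.pem$/, /\.tfstate$/, /\.tfvars$/];

// Illustrative secret-looking patterns (a production scanner would use a far larger set).
const SECRET_PATTERNS: Array<[RegExp, string]> = [
  [/(api[_-]?key\s*[:=]\s*)\S+/gi, "$1[REDACTED]"],
  [/AKIA[0-9A-Z]{16}/g, "[REDACTED]"],
  [/-----BEGIN [A-Z ]*PRIVATE KEY-----[\s\S]*?-----END [A-Z ]*PRIVATE KEY-----/g, "[REDACTED KEY]"],
];

// True if a directory or file name should be excluded from scanning.
export function shouldSkip(name: string): boolean {
  return SKIP_DIRS.has(name) || SKIP_FILES.some((re) => re.test(name));
}

// Mask anything that looks like a credential before text reaches the model.
export function redactSecrets(text: string): string {
  return SECRET_PATTERNS.reduce((out, [re, repl]) => out.replace(re, repl), text);
}
```

The key design point is that redaction happens before any file content is handed to the model, so even a local prompt never contains raw credentials.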

All inference runs through Ollama at http://127.0.0.1:11434, so your code never leaves your machine.
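A local generation call is just an HTTP request to that endpoint. The /api/generate route and the { model, prompt, stream } payload are part of Ollama's documented HTTP API; the helper name below is an illustrative assumption, not KT Studio's internal API.

```typescript
const OLLAMA_URL = "http://127.0.0.1:11434";

// Build a request against Ollama's /api/generate endpoint.
export function buildGenerateRequest(model: string, prompt: string) {
  return {
    url: `${OLLAMA_URL}/api/generate`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: true }),
    },
  };
}

// Usage (requires Ollama running locally):
//   const { url, init } = buildGenerateRequest("qwen3-coder:30b", "Explain this repo");
//   const res = await fetch(url, init);
```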


Tech Stack

  • Next.js (App Router) + TypeScript
  • Tailwind CSS + shadcn/ui
  • SQLite via Prisma
  • Ollama (default model: qwen3-coder:30b)
  • react-markdown + Mermaid.js
  • Server‑Sent Events (SSE) for real‑time streaming
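For the streaming piece, an SSE endpoint in a Next.js App Router project typically looks something like this. This is a sketch under standard Next.js conventions, not KT Studio's actual route; the event names and payloads are made up for illustration.

```typescript
// Format one SSE frame: "event: <name>\ndata: <json>\n\n"
export function sseMessage(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Sketch of app/api/generate/route.ts: stream progress events to the browser.
export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode(sseMessage("progress", { step: "scanning" })));
      controller.enqueue(encoder.encode(sseMessage("done", { ok: true })));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream", "Cache-Control": "no-cache" },
  });
}
```

On the client, a standard `new EventSource("/api/generate")` (or a fetch body reader) consumes these frames as they arrive, which is what drives the live generation progress in the UI.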

Real Use Cases

Developer Onboarding

Generate a project blueprint before a new developer joins.

Knowledge Transfer

When a senior dev leaves, generate structured KT docs in minutes.

Codebase Audits

Understand inherited repositories faster.

AI‑Heavy Projects

Automatically detect:

  • Model usage
  • Prompt templates
  • Embedding pipelines
  • RAG‑style integrations
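Detection like this can start with something as simple as checking a project's declared dependencies. A rough sketch, with the caveat that the package list and function name here are examples, not KT Studio's real detection logic:

```typescript
// Example set of AI/LLM-related npm packages to flag (illustrative, not exhaustive).
const AI_PACKAGES = ["openai", "ollama", "langchain", "@langchain/core", "llamaindex"];

// Return the AI-related packages found in a parsed package.json.
export function detectAiIntegrations(pkg: { dependencies?: Record<string, string> }): string[] {
  const deps = Object.keys(pkg.dependencies ?? {});
  return AI_PACKAGES.filter((name) => deps.includes(name));
}
```

A fuller implementation would also grep source files for SDK imports and prompt-template patterns, but dependency scanning catches the common cases cheaply.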

Consulting & Freelancing

Deliver solid documentation alongside project handoffs.


How to Run It

Prerequisites

  • Node.js v18+
  • Ollama running locally (http://127.0.0.1:11434)
  • An Ollama model installed (defaults to qwen3-coder:30b; pull it with: ollama pull qwen3-coder:30b)

Install & Start

```shell
npm install
npx prisma db push
npx prisma generate
npm run dev
```

Open the app in your browser (the Next.js dev server defaults to http://localhost:3000).

Generating Docs (Local Mode)

  1. Ensure Ollama is running.
  2. Click New Project.
  3. Choose a local folder and enter the absolute path.
  4. Click Start Generation.
  5. After it finishes, click View Documentation.

What’s Next?

Planned expansions:

  • Git repository import mode
  • Vector‑based semantic indexing
  • Incremental regeneration
  • Architecture diffing between versions
  • Plugin‑based documentation sections
  • Team collaboration mode

Looking for Collaborators

If you’re interested in:

  • AI‑powered developer tools
  • Local LLM systems
  • Repo intelligence
  • Developer productivity
  • Open‑source infrastructure

I’d love to collaborate.

Repo:


Bigger Vision

What if every repository could explain itself?
What if onboarding took hours instead of weeks?
What if knowledge transfer wasn’t dependent on memory?

That’s the problem KT Studio aims to solve. Let’s make codebases self‑explanatory.
