How I bundle my codebase so ChatGPT can actually understand it

Published: January 19, 2026 at 05:21 AM EST
2 min read
Source: Dev.to

Problem

LLM chat apps are great at answering questions—until you point them at a real codebase.
When a project grows past a certain size, context becomes the bottleneck: too many files, too much noise, not enough structure.

Typical LLM chat apps work best when:

  • Context is linear
  • Files are grouped by meaning, not size
  • References are precise (file + line)

Real repositories, however, are:

  • Hierarchical
  • Noisy
  • Full of things the model doesn’t need

The result is vague or hallucinated answers.

Solution: srcpack

Instead of feeding the model the whole repo, generate an LLM‑optimized snapshot:

  • Code is bundled by domain (e.g., web, api, docs, etc.)
  • Each bundle is indexed with file paths + line numbers
  • .gitignore is respected by default
  • No configuration needed to get started

The output consists of plain‑text files, easy to upload or attach to ChatGPT, Gemini, Claude, etc.
The tool is called srcpack.
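
To make that concrete, here is roughly the shape such a snapshot can take. The headers and index layout below are made up for this post, not srcpack's documented format:

  ===== Bundle: web =====
  Index
    web/components/LoginForm.tsx .... lines 1-120
    web/hooks/useAuth.ts ............ lines 121-185

  ===== web/components/LoginForm.tsx =====
  1: import { useState } from "react";
  2: ...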

Workflows where srcpack shines

  • Exploring large repos – Ask “where does auth actually live?” or “what touches billing?”
  • Avoiding context limits – Instead of pasting files manually, attach a focused bundle.
  • Sharing with non‑technical teammates – Upload an LLM‑friendly snapshot to Google Drive and share it.

Typical questions from PMs or designers:

  • “What shipped this week?”
  • “What’s still in progress?”
  • “Which parts are risky?”

srcpack acts like a lightweight, read‑only AI interface to the codebase.

Usage

npx srcpack   # or bunx srcpack

It runs with zero configuration by default.
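
If you are curious how a snapshot like this can be produced in principle, the core idea fits in a short script. The sketch below is not srcpack's implementation; the output directory name, the naive .gitignore handling (exact names only, no glob patterns), and the bundle layout are all assumptions for illustration:

// bundle.ts — a simplified sketch of the general idea, not srcpack's implementation.
// It groups files by top-level directory ("domain"), skips ignored paths,
// and writes one plain-text bundle per domain with line-numbered contents.
import * as fs from "node:fs";
import * as path from "node:path";

const ROOT = process.cwd();
const OUT = path.join(ROOT, "bundles"); // output directory name is an assumption
const SKIP = new Set([".git", "node_modules", "bundles"]);

// Naive .gitignore handling: exact file/directory names only, no glob patterns.
const gitignorePath = path.join(ROOT, ".gitignore");
if (fs.existsSync(gitignorePath)) {
  for (const raw of fs.readFileSync(gitignorePath, "utf8").split("\n")) {
    const entry = raw.trim().replace(/\/$/, "");
    if (entry && !entry.startsWith("#")) SKIP.add(entry);
  }
}

// Recursively collect file paths relative to the repo root, skipping ignored names.
function walk(dir: string): string[] {
  const files: string[] = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    if (SKIP.has(entry.name)) continue;
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) files.push(...walk(full));
    else files.push(path.relative(ROOT, full));
  }
  return files;
}

// Group files by their top-level directory, e.g. web/, api/, docs/.
const domains = new Map<string, string[]>();
for (const file of walk(ROOT)) {
  const domain = file.includes(path.sep) ? file.split(path.sep)[0] : "root";
  if (!domains.has(domain)) domains.set(domain, []);
  domains.get(domain)!.push(file);
}

// Emit one plain-text bundle per domain, prefixing every line with its number
// so the model can answer with precise file + line references.
fs.mkdirSync(OUT, { recursive: true });
for (const [domain, files] of domains) {
  const chunks: string[] = [];
  for (const file of files) {
    chunks.push(`===== ${file} =====`);
    const lines = fs.readFileSync(path.join(ROOT, file), "utf8").split("\n");
    lines.forEach((line, i) => chunks.push(`${i + 1}: ${line}`));
  }
  fs.writeFileSync(path.join(OUT, `${domain}.txt`), chunks.join("\n"));
}

Running it with something like npx tsx bundle.ts (assuming tsx is available) produces one <domain>.txt file per top-level directory; srcpack's own output and options may differ.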

Documentation & Source

  • Docs:
  • GitHub:

Closing Thought

I don’t think this is the final answer to LLM + codebase interaction, but it’s been a very practical improvement for day‑to‑day work. I’m curious how others are handling large‑repo context with LLMs—especially in fast‑moving projects.
