I built a VS Code extension that lets you chat with your database - everything runs locally

Published: March 9, 2026 at 08:39 PM EDT
2 min read
Source: Dev.to

I spent a week onboarding into a project with a SQL Server database I’d never seen before. Dozens of stored procedures, no documentation, and the previous dev had left. I kept thinking — why can’t I just ask this database what it does?

How it works under the hood

The extension runs an all‑MiniLM‑L6‑v2 embedding model entirely in Node.js and stores all embeddings in LanceDB. This local‑first approach means no data ever leaves your machine.
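Conceptually, answering a question means embedding it and finding the nearest schema summaries by similarity. The extension uses LanceDB for this; here is a toy in‑memory sketch of the retrieval step, with made‑up 3‑dimensional vectors standing in for the real 384‑dimensional MiniLM embeddings:

```typescript
// Toy in-memory vector search illustrating the retrieval step.
// The real extension embeds text with all-MiniLM-L6-v2 and stores the
// vectors in LanceDB; the documents and vectors below are invented.

type Doc = { text: string; vector: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k documents most similar to the query vector.
function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.vector) - cosine(query, x.vector))
    .slice(0, k);
}

const docs: Doc[] = [
  { text: "usp_GetOrders: returns orders for a customer", vector: [0.9, 0.1, 0.0] },
  { text: "usp_PurgeLogs: deletes audit rows older than 90 days", vector: [0.0, 0.2, 0.9] },
];

// A query vector close to the first document wins.
console.log(topK([0.8, 0.2, 0.1], docs, 1)[0].text);
```

A real vector database adds persistence and approximate‑nearest‑neighbor indexing on top of this idea, but the ranking step is the same.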

A thing I got wrong early on

I initially set llama3.1:8b as the default model and even added a warning if you picked something lighter. That was backwards: in many cases the “boring” solution (a smaller, faster model) is the right one.

Why local‑first

Running everything locally protects sensitive schema information and eliminates latency from round‑trips to external APIs. All processing, including LLM inference and vector storage, happens on your own hardware.

Getting started

Install and start Ollama

  1. Follow the official Ollama installation guide for your platform.

  2. Start the Ollama service:

    ollama serve
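Once the service is running, anything on your machine can talk to its local HTTP API. A minimal sketch of the kind of request the extension would send to Ollama's `/api/generate` endpoint (the model name and prompt here are examples, not SchemaSight's actual values):

```typescript
// Build a request for Ollama's local /api/generate endpoint.
// Model name and prompt are illustrative, not SchemaSight's real values.

function buildGenerateRequest(model: string, prompt: string) {
  return {
    url: "http://localhost:11434/api/generate",
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

const req = buildGenerateRequest(
  "llama3.1:8b",
  "Summarize what the stored procedure usp_GetOrders does."
);

// Sending it requires a running `ollama serve`:
// const res = await fetch(req.url, { method: "POST", body: req.body });
// const { response } = await res.json();
```

Because the endpoint is bound to localhost, the prompt and the schema text inside it never leave the machine.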

Install SchemaSight from the VS Code Marketplace

Search for SchemaSight in the VS Code Marketplace and click Install.

Add a connection and follow the onboarding flow

  1. Open the SchemaSight panel in VS Code.
  2. Click Add Connection and provide your database credentials.
  3. The onboarding wizard will:
    • Verify your Ollama installation.
    • Pull the selected LLM model.
    • Crawl the database schema and index it locally.
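The crawl‑and‑index step boils down to a loop over schema objects: summarize each one with the LLM, embed the summary, and store the vector. A sketch of that loop, where `summarize` and `embed` are hypothetical stubs standing in for the real Ollama and MiniLM calls:

```typescript
// Sketch of the crawl-and-index loop. `summarize` and `embed` are stubs
// standing in for the local LLM and the embedding model.

type IndexedObject = { name: string; summary: string; vector: number[] };

function summarize(ddl: string): string {
  // Stub: the real version prompts the local LLM with the object's DDL.
  return `Summary of: ${ddl.slice(0, 40)}`;
}

function embed(text: string): number[] {
  // Stub: the real version runs all-MiniLM-L6-v2 (384-dim output).
  return Array.from(text).map((c) => c.charCodeAt(0) / 255);
}

function indexSchema(objects: { name: string; ddl: string }[]): IndexedObject[] {
  return objects.map(({ name, ddl }) => {
    const summary = summarize(ddl);
    return { name, summary, vector: embed(summary) };
  });
}

const index = indexSchema([
  { name: "usp_GetOrders", ddl: "CREATE PROCEDURE usp_GetOrders ..." },
]);
console.log(index[0].summary);
```

Each object goes through the LLM once, which is why the initial indexing pass dominates the setup time.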

Caveat: The initial indexing step summarizes every schema object using your local LLM. On an M5 MacBook Pro with ~95 objects, this takes about 15–20 minutes. It’s a one‑time cost; you only need to re‑index when the schema changes or you switch models.
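To estimate how long your own database will take, divide the total time by the object count. For the figures above:

```typescript
// Rough per-object cost from the numbers above: ~95 objects in 15-20 minutes.
const objects = 95;
const perObjectLow = (15 * 60) / objects;  // lower bound, in seconds
const perObjectHigh = (20 * 60) / objects; // upper bound, in seconds
console.log(`${perObjectLow.toFixed(1)}-${perObjectHigh.toFixed(1)} s per object`);
// → roughly 9.5-12.6 s per object
```

So a database with a few hundred objects could take an hour or more on similar hardware; a faster model shortens this proportionally.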

License & feedback

The source code is available on GitHub under the MIT license. Feel free to open issues or submit pull requests, and let me know your thoughts in the comments.
