Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

Published: February 7, 2026 at 08:26 PM EST
3 min read

Source: Hacker News

LocalGPT

A local-first AI assistant built in Rust — persistent memory, autonomous tasks, ~27 MB binary. Inspired by and compatible with OpenClaw.

cargo install localgpt

Why LocalGPT?

  • Single binary — no Node.js, Docker, or Python required
  • Local-first — runs entirely on your machine; your memory stays yours
  • Persistent memory — markdown‑based knowledge store with full‑text and semantic search
  • Autonomous heartbeat — delegate tasks and let it work in the background
  • Multiple interfaces — CLI, web UI, desktop GUI
  • Multiple LLM providers — Anthropic (Claude), OpenAI, Ollama
  • OpenClaw compatible — works with SOUL, MEMORY, HEARTBEAT markdown files and skills format

Install

cargo install localgpt

Quick Start

# Initialize configuration
localgpt config init

# Start interactive chat
localgpt chat

# Ask a single question
localgpt ask "What is the meaning of life?"

# Run as a daemon with heartbeat, HTTP API and web UI
localgpt daemon start

How It Works

LocalGPT uses plain markdown files as its memory:

~/.localgpt/workspace/
├── MEMORY.md            # Long‑term knowledge (auto‑loaded each session)
├── HEARTBEAT.md         # Autonomous task queue
├── SOUL.md              # Personality and behavioral guidance
└── knowledge/           # Structured knowledge bank (optional)
    ├── finance/
    ├── legal/
    └── tech/

Files are indexed with SQLite FTS5 for fast keyword search and with sqlite-vec for semantic search over local embeddings.
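
LocalGPT's actual schema isn't shown here, but the FTS5 side can be sketched with Python's stdlib sqlite3 module (table and column names below are illustrative, not LocalGPT's):

```python
import sqlite3

# Illustrative sketch of keyword indexing with SQLite FTS5 (hypothetical
# schema): one row per markdown file, content full-text indexed.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE notes USING fts5(path, content)")
con.executemany(
    "INSERT INTO notes VALUES (?, ?)",
    [
        ("MEMORY.md", "long-term notes on Rust lifetimes"),
        ("HEARTBEAT.md", "queued task: summarize unread articles"),
    ],
)
# FTS5 MATCH performs fast keyword search across all indexed columns.
rows = con.execute(
    "SELECT path FROM notes WHERE notes MATCH 'lifetimes'"
).fetchall()
print(rows)  # [('MEMORY.md',)]
```

The semantic-search half works analogously, except sqlite-vec stores embedding vectors and ranks rows by vector distance rather than keyword match.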

Configuration

Stored at ~/.localgpt/config.toml:

[agent]
default_model = "claude-cli/opus"

[providers.anthropic]
api_key = "${ANTHROPIC_API_KEY}"

[heartbeat]
enabled = true
interval = "30m"
active_hours = { start = "09:00", end = "22:00" }

[memory]
workspace = "~/.localgpt/workspace"
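
As a rough sketch (not LocalGPT's internal code) of how the active_hours window above might gate a heartbeat cycle:

```python
from datetime import datetime, time

# Hypothetical check mirroring the [heartbeat] config above: cycles run
# only between active_hours.start and active_hours.end (same-day window).
ACTIVE_START = time(9, 0)   # "09:00"
ACTIVE_END = time(22, 0)    # "22:00"

def heartbeat_allowed(now: datetime) -> bool:
    """True if a heartbeat cycle may run at `now`."""
    return ACTIVE_START <= now.time() <= ACTIVE_END

print(heartbeat_allowed(datetime(2026, 2, 7, 12, 30)))  # True
print(heartbeat_allowed(datetime(2026, 2, 7, 23, 15)))  # False
```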

CLI Commands

Chat & Ask

localgpt chat                    # Interactive chat
localgpt chat --session          # Resume session
localgpt ask "question"          # Single question

Daemon

localgpt daemon start      # Start background daemon
localgpt daemon stop       # Stop daemon
localgpt daemon status     # Show status
localgpt daemon heartbeat  # Run one heartbeat cycle

Memory

localgpt memory search "query"   # Search memory
localgpt memory reindex          # Reindex files
localgpt memory stats            # Show statistics

Config

localgpt config init   # Create default config
localgpt config show   # Show current config

HTTP API

When the daemon is running:

Endpoint                      Description
GET  /health                  Health check
GET  /api/status              Server status
POST /api/chat                Chat with the assistant
GET  /api/memory/search?q=    Search memory
GET  /api/memory/stats        Memory statistics

Built With

Rust, Tokio, Axum, SQLite (FTS5 + sqlite‑vec), fastembed, eframe

License

Apache‑2.0
