Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

created: Feb. 8, 2026, 1:26 a.m. | updated: Feb. 8, 2026, 6:52 a.m.

# LocalGPT

A local-first AI assistant built in Rust — persistent memory, autonomous tasks, ~27 MB binary. Inspired by and compatible with OpenClaw.

```sh
cargo install localgpt
```

## Why LocalGPT?

## Configuration

```toml
[anthropic]
api_key = "${ANTHROPIC_API_KEY}"

[heartbeat]
enabled = true
interval = "30m"
active_hours = { start = "09:00", end = "22:00" }

[memory]
workspace = "~/.localgpt/workspace"
```

## CLI Commands

```sh
# Chat
localgpt chat                   # Interactive chat
localgpt chat --session <id>    # Resume session
localgpt ask "question"         # Single question

# Daemon
localgpt daemon start           # Start background daemon
localgpt daemon stop            # Stop daemon
localgpt daemon status          # Show status
localgpt daemon heartbeat       # Run one heartbeat cycle

# Memory
localgpt memory search "query"  # Search memory
localgpt memory reindex         # Reindex files
localgpt memory stats           # Show statistics

# Config
localgpt config init            # Create default config
localgpt config show            # Show current config
```

## HTTP API

When the daemon is running:

| Endpoint | Description |
| --- | --- |
| `GET /health` | Health check |
| `GET /api/status` | Server status |
| `POST /api/chat` | Chat with the assistant |
| `GET /api/memory/search?q=<query>` | Search memory |
| `GET /api/memory/stats` | Memory statistics |

## Blog

Why I Built LocalGPT in 4 Nights — the full story with a commit-by-commit breakdown.

## Built With

Rust, Tokio, Axum, SQLite (FTS5 + sqlite-vec), fastembed, eframe

## Contributors

## Stargazers

## License

Apache-2.0
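As a quick way to exercise the daemon's HTTP API, here is a minimal client sketch in Python (stdlib only). It assumes the daemon listens on `127.0.0.1:8080` (the README does not state a default address or port) and that `POST /api/chat` accepts a JSON body with a `message` field — both are guesses, so adjust them to match the actual server.

```python
import json
import urllib.parse
import urllib.request

# Assumed default address; the README does not document the daemon's port.
BASE_URL = "http://127.0.0.1:8080"


def search_url(query: str, base: str = BASE_URL) -> str:
    """Build the GET /api/memory/search?q=<query> URL with proper escaping."""
    return f"{base}/api/memory/search?q={urllib.parse.quote(query)}"


def search_memory(query: str):
    """Call the memory-search endpoint and return the decoded JSON body."""
    with urllib.request.urlopen(search_url(query)) as resp:
        return json.load(resp)


def chat_request(message: str, base: str = BASE_URL) -> urllib.request.Request:
    """Build a POST /api/chat request; the `message` field name is a guess."""
    data = json.dumps({"message": message}).encode()
    return urllib.request.Request(
        f"{base}/api/chat",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Requires `localgpt daemon start` to be running first.
    print(search_memory("rust async"))
```

Run `localgpt daemon start`, then execute the script; the query string is percent-encoded before it reaches the server, so multi-word searches work as-is.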

6 hours, 4 minutes ago: Hacker News