How to run an LLM on your laptop

Grace Huckins

created: July 17, 2025, 5:01 p.m. | updated: July 22, 2025, 6 p.m.

For Pistilli, opting for local models as opposed to online chatbots has implications beyond privacy. Local LLMs may have their quirks, but at least they are consistent: the only person who can change your local model is you. “Running local models is actually a really good exercise for developing that broader intuition for what these things can do,” Willison says.

If you’re comfortable using your computer’s command-line interface, which lets you browse files and run apps by typing commands, Ollama is a great option.
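To give a sense of what command-line use looks like, here is a minimal sketch of getting started with Ollama. The install command is the one published on Ollama's site for Linux (macOS and Windows users can download an installer instead), and the model name shown is just an example; substitute any model from Ollama's library that fits in your machine's memory.

```shell
# Install Ollama on Linux (macOS/Windows: download the app from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model to your machine (example model; pick one that fits your RAM)
ollama pull llama3.2

# Start an interactive chat session in the terminal
ollama run llama3.2

# List the models you have downloaded
ollama list
```

Everything here runs on your own hardware: once a model is pulled, chatting with it requires no internet connection, which is the consistency and privacy benefit described above.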

Source: MIT Technology Review