
How to run an LLM on your PC, not in the cloud, in less than 10 minutes

Cut through the hype, keep your data private, find out what all the fuss is about

Tobias Mann
Sun 17 Mar 2024 // 14:00 UTC

HANDS ON With all the talk of massive machine-learning training clusters and AI PCs, you’d be forgiven for thinking you need some kind of special hardware to play with text- and code-generating large language models (LLMs) at home.

In reality, there’s a good chance the desktop system you’re reading this on is more than capable of running a wide range of LLMs, including chatbots like Mistral or source code generators like Code Llama.

In fact, with openly available tools like Ollama, LM Studio, and Llama.cpp, it’s relatively easy to get these models running on your system.

In the interest of simplicity and cross-platform compatibility, we’re going to be looking at Ollama, which, once installed, works more or less the same across Windows, Linux, and macOS.
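One reason Ollama travels well across platforms is that, beyond its command-line interface (for example `ollama run mistral`), it serves a local REST API on port 11434 by default, so any language that can make an HTTP request can talk to your local model. Below is a minimal sketch of calling that documented `/api/generate` endpoint from Python's standard library; the helper names `build_generate_request` and `ask_ollama` are our own, and the sketch assumes you have Ollama running and the `mistral` model pulled.

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    "stream": False asks for a single complete JSON reply rather than
    a token-by-token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="mistral", url=OLLAMA_URL):
    """Send a prompt to a locally running Ollama server; return the reply text."""
    data = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply is one JSON object with a "response" field.
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(ask_ollama("Why is the sky blue?"))
    except OSError:
        print("No Ollama server reachable on localhost:11434")
```

Note that everything here stays on your machine: the request never leaves localhost, which is the whole privacy point of running the model yourself.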