v1.0.0 · free & open source

Meet LlamaBoss.
Your local LLMs,
in a real desktop app.

A native Windows chat app for running local models. Multimodal, streaming, agentic — and entirely offline.

What’s in the box

Built to feel like a real app.

Two backends

LlamaBoss rides on top of Ollama. LlamaBoss Pro talks to llama.cpp directly.

Multimodal

Drop in images and text files. Chat about screenshots or paste right from the clipboard.

Agentic tools

Filesystem, shell, and workspace access — with confirmation gates so nothing runs by surprise.

100% local

Runs entirely on your machine. No cloud, no telemetry, no account.

Download

Grab the latest build.

Windows installer, MIT licensed, no account needed. Open the installer, click through, and start chatting with your local models.

Download LlamaBoss v1.0.0
Windows 10/11 · x64 · Requires Ollama
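
If you don't already have Ollama set up, a typical first run looks something like this (the model name is just an example; any model from the Ollama library works):

```shell
# Install Ollama first (on Windows, download the installer from ollama.com).
# Then pull a model to chat with:
ollama pull llama3.2

# Confirm the model is available locally:
ollama list
```

Once a model is pulled, launch LlamaBoss and it can use any model Ollama has installed.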

Coming soon: LlamaBoss Pro — direct llama.cpp backend, no Ollama needed, with CUDA auto-detection. Currently in active development.