So… with all this OpenClaw stuff, I was wondering: what’s the FOSS status of something like it that I can run locally? Can I get my own locally run agent that I can ask to perform simple tasks (go and find this, download that, summarize an article)? I’m just kinda curious about all of this.
Thanks!
https://wiki.archlinux.org/title/Ollama
Ollama is an application that lets you run large language models locally, offline.
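For the “summarize an article” kind of task, the ollama Python package talks directly to the local server. A minimal sketch, assuming you’ve already pulled a model (qwen3 here is just an example, and article.txt is a hypothetical local file):

```python
import ollama  # pip install ollama; talks to the locally running Ollama server

article_text = open("article.txt").read()  # hypothetical local file

response = ollama.chat(
    model="qwen3",  # assumption: swap in whatever model you've pulled
    messages=[{"role": "user", "content": "Summarize this article:\n" + article_text}],
)
print(response["message"]["content"])
```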
I’ve had better luck with llama.cpp for opencode; I’m guessing it handles the formatting for tool use better.
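For what it’s worth, llama.cpp’s llama-server exposes an OpenAI-compatible endpoint, so you can poke it with plain HTTP before wiring it into anything. A rough sketch, assuming you’ve started llama-server on its default port 8080 with some GGUF model (the model field is mostly a placeholder here):

```python
import requests

# Assumes the server is already running, e.g.:
#   llama-server -m your-model.gguf --port 8080
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "local",  # llama-server serves whatever model it was started with
        "messages": [{"role": "user", "content": "List three uses for a local LLM."}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```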
I’m curious about this too. I know that with the latest version of Ollama it’s possible to run OpenClaw against it. I assumed you needed to point it at a paid API (Claude, ChatGPT, Grok, etc.) for it to really work, but yeah, maybe it works with Qwen 3 or similar models?
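I can’t vouch for OpenClaw’s config specifically, but anything that speaks the OpenAI API can usually be pointed at Ollama instead of a paid service, since Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1. A sketch with the openai client (qwen3 is an assumption; the api_key is required by the client but ignored by Ollama):

```python
from openai import OpenAI

# Point the standard OpenAI client at the local Ollama server.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

reply = client.chat.completions.create(
    model="qwen3",  # assumption: a model you've pulled with `ollama pull qwen3`
    messages=[{"role": "user", "content": "Say hi from a local model."}],
)
print(reply.choices[0].message.content)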
I guess a major factor here is what your system resources look like, especially how much RAM you have, since that determines which model you can host locally.
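As a very rough rule of thumb (ballpark only): the weights of a quantized model take about params × bits ÷ 8 bytes, plus extra for the KV cache and runtime overhead. Something like:

```python
def approx_weight_ram_gb(params_billion: float, bits_per_weight: int = 4) -> float:
    """Rough RAM needed for a quantized model's weights alone.

    Ballpark only: real usage adds KV cache and runtime overhead,
    which grows with context length.
    """
    return params_billion * bits_per_weight / 8  # billions of params * bytes per weight

# e.g. an 8B model at 4-bit quantization is ~4 GB of weights,
# which fits comfortably on a 16 GB machine; a 70B model doesn't.
print(f"{approx_weight_ram_gb(8):.1f} GB")   # ~4.0 GB
print(f"{approx_weight_ram_gb(70):.1f} GB")  # ~35.0 GB
```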