I built an open-source computer-use framework that uses local LLMs with Ollama

twitter.com

4 points by powerawq103846 9 months ago · 1 comment

powerawq103846 (OP) 9 months ago

We just launched stable support for local LLMs via Ollama in Agent today - our computer-use agent framework for macOS. Repo here: https://github.com/trycua/cua

Run models like Gemma3, Phi4, Qwen2.5, Llama3.1, and anything else supported by Ollama, keeping your data completely private - no cloud required.
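If you want to sanity-check a model locally before wiring it into Agent, the official `ollama` Python client is the quickest way (the model name here is just an example - use whatever you've pulled):

```python
import ollama  # pip install ollama - talks to the local Ollama server

# Assumes the Ollama server is running and the model has been pulled,
# e.g. `ollama pull llama3.1`. Nothing here leaves your machine.
response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
)
print(response["message"]["content"])
```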

Agent with local LLMs combines UI grounding via OmniParser with pixel-level detection for accurate computer control - all in Cua's sandboxed environment for security.

Privacy-conscious? Local models through Ollama run entirely on your machine while maintaining the same agent capabilities, accelerated by MPS on Apple Silicon.
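Assuming the grounding model runs via PyTorch (which is how OmniParser is typically deployed - an assumption on our part, not something Ollama itself needs), you can confirm the Metal backend is available with:

```python
import torch

# MPS is PyTorch's Metal acceleration backend on Apple Silicon.
# If this prints False, inference falls back to CPU.
print("MPS available:", torch.backends.mps.is_available())
print("MPS built into this PyTorch build:", torch.backends.mps.is_built())
```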

Simple to set up: just `pip install "cua-agent[all]"` and connect to your local Ollama models. Check out the examples in our repo: https://github.com/trycua/cua
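Roughly, a run looks like the sketch below. It follows the example scripts in the repo's README at the time, so treat the exact names (Computer, ComputerAgent, AgentLoop.OMNI, LLMProvider.OLLAMA) as assumptions and check the repo for the current API:

```python
import asyncio

# Names below follow the repo's example scripts; verify against the
# current README, as the API may have changed since.
from computer import Computer
from agent import ComputerAgent, LLM, AgentLoop, LLMProvider

async def main():
    # Computer manages the sandboxed macOS environment the agent controls.
    async with Computer() as macos_computer:
        agent = ComputerAgent(
            computer=macos_computer,
            loop=AgentLoop.OMNI,  # OmniParser-based grounding loop
            # Any model you've pulled with Ollama works here.
            model=LLM(provider=LLMProvider.OLLAMA, name="gemma3:4b"),
        )
        # Steps stream back as the agent drives the sandboxed desktop.
        async for result in agent.run("Open Safari and go to github.com/trycua/cua"):
            print(result)

asyncio.run(main())
```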
