Ask HN: A good Model to choose in Ollama to run on Claude Code

3 points by sujayk_33 5 days ago · 1 comment


Given that Claude Code supports a locally running model via Ollama, which is the best thinking model with tool support that I can pick for good output?

Also, if anyone has tried this, does it still require a Claude subscription?

(I currently have an RTX 5060 machine with 8GB of VRAM)

parthsareen 3 days ago

Hey! One of the maintainers of Ollama here. 8GB of VRAM is a bit tight for coding agents since their prompts are quite large. You could try playing with qwen3 and at least a 16k context length to see how it works.
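For anyone who wants to try this, here is a minimal sketch of one way to set it up. It assumes a recent Ollama install; the Modelfile `PARAMETER num_ctx` setting is the standard way to raise a model's context window, and `ANTHROPIC_BASE_URL` / `ANTHROPIC_MODEL` are documented Claude Code environment variables for pointing it at a different endpoint and model. Whether Claude Code can talk to the local Ollama API directly, or needs a translation proxy in between, depends on the versions you are running.

```
# Sketch: run qwen3 locally with a larger context window and point
# Claude Code at it. API compatibility between Claude Code and the
# local Ollama endpoint is an assumption here; a proxy may be needed.

# Pull the model (the 8B variant is a realistic fit for 8GB of VRAM,
# though some layers may spill to system RAM at a 16k context).
ollama pull qwen3:8b

# Create a variant with a 16k context window via a Modelfile.
cat > Modelfile <<'EOF'
FROM qwen3:8b
PARAMETER num_ctx 16384
EOF
ollama create qwen3-16k -f Modelfile

# Quick smoke test that the model loads and responds.
ollama run qwen3-16k "Write a Python function that reverses a string."

# Point Claude Code at the local endpoint. The URL below is Ollama's
# default listen address; the exact endpoint shape Claude Code expects
# may differ, so treat this part as an assumption to verify.
export ANTHROPIC_BASE_URL="http://localhost:11434"
export ANTHROPIC_MODEL="qwen3-16k"
claude
```

Keeping the quantized 8B model plus a 16k KV cache inside 8GB is the tight part; if generation slows down noticeably, that usually means layers have been offloaded to system RAM.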
