PydanticAI using Ollama (llama3.2) running locally

github.com

2 points by scolvin a year ago · 3 comments

eternityforest a year ago

So cool! I wonder what the weakest model that can still call functions and such is?

I don't have anything more powerful than an i5 other than my phone, and a lot of interesting applications like home automation really need to be local-first for reliability.

0.5b to 1b models seem to have issues with even pretty basic reasoning and question answering, but maybe I'm just Doing It Wrong.
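For context on the local-first setup the thread is about: Ollama serves models such as llama3.2 over an OpenAI-compatible HTTP endpoint, which is what lets libraries like PydanticAI talk to it. A minimal sketch of that request shape, assuming a default Ollama install listening on localhost:11434 (the `ask` helper and prompt text here are illustrative, not from the thread):

```python
import json
import urllib.request

# Assumption: default Ollama install, OpenAI-compatible endpoint.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single-turn chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str) -> str:
    """Send the request to a locally running Ollama server."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible response: first choice's message content.
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Print the payload only; calling ask() requires Ollama to be running.
    print(json.dumps(build_chat_request("llama3.2", "Why is the sky blue?"), indent=2))
```

Swapping in a 0.5b–1b model is just a matter of changing the `model` string, which is what makes the commenter's "weakest model that can still call functions" experiment cheap to run.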
