Who is using Ollama day-to-day?
I maintain a Chrome extension that adds AI support to any input field online.
One of my favorite reviews is a 2-star review asking for “local model support”.
My first reaction was: who installs a 100KB Chrome extension to talk to a 10GB model running locally?
But it did make me curious.
Are people actually running Ollama or LM Studio as part of their daily workflow?

I'm building a script for my own dev workflow that spins up DigitalOcean droplets with H100 GPUs to run high-performing LLMs, all driven through Open WebUI on top of Ollama. I find the models my 12GB GPU will tolerate hallucinate too much. They're fine for diffing or basic scripts, but anything somewhat complicated needs 60GB+ of GPU memory, which isn't feasible for me to buy.

Yes, I do, and it is quite helpful. For main coding tasks I use Claude, but when my credits vanish, or when I have enough time to wait for an answer, I use Ollama extensively. I would recommend developers maintain their own AI pipeline as well. For anything dealing with personal data, like browser inputs, I would exclusively use local models. Probably still niche, but non-local AI would be a deal breaker for me in both browsers and the OS.

I'm curious what the workflow actually looks like for people running Ollama day-to-day. Do you mostly use it through the terminal, a UI like Open WebUI, or via integrations with other tools? I'm trying to understand where a browser integration would actually fit, if at all.

I use Cursor for Claude mostly; I just like it better than the Claude console version for some reason. For Ollama I switch to VS Code and mostly use Roo Code or Cline. I have a Gitea/Playwright MCP/webservice setup, which works well with both, though for the latter I tend to restrict it to local models.

I am running it, but it is only useful for very easy inference tasks; anything harder needs very high computing power. Currently I'm running it on a 32GB Mac Studio M1, and I mostly use it for generating commit messages.

If you don't care, don't waste your time. That user should make their own 100KB extension.

I went ahead and did it... XP At the start, I was curious to see if I could. It ended up being a rabbit hole, of course. I can see some use cases for it, and the plan is to dogfood it and validate them.
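For anyone wondering how a browser integration with a local model actually looks: talking to Ollama from an extension is only a handful of lines against its HTTP API. A minimal sketch, assuming Ollama's default port (11434) and a model name ("llama3.2") you have already pulled; both are assumptions, not details from the thread:

```javascript
// Build the request body for Ollama's /api/generate endpoint.
// Kept as a separate pure function so it is easy to test without a server.
function buildGenerateRequest(model, prompt) {
  return {
    model,         // e.g. "llama3.2" (assumed; use whatever `ollama list` shows)
    prompt,        // text taken from the focused input field
    stream: false, // ask for a single JSON response instead of a token stream
  };
}

// Send the prompt to the local Ollama instance and return the completion text.
async function completeLocally(prompt, model = "llama3.2") {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text
}
```

In practice the extension would also need `http://localhost:11434/*` in its host permissions, and Ollama may need `OLLAMA_ORIGINS` set so it accepts requests from a `chrome-extension://` origin.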