Ask HN: People with new Macs / computers with GPUs, do you run LLMs locally?

3 points by vishalontheline 7 months ago · 2 comments


I am considering finally upgrading to a new computer, most likely an M4 Mac, with the primary goal of running coding assistants and training my own models locally.

Is this a good idea? Have you tried it? How's the performance?

Thank you!

fiiv 7 months ago

I do this. Ollama makes it very easy: just pull the model you want. The great thing is being able to test different models on the same tasks; there's a huge difference even between comparable models.
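For anyone who hasn't tried it, the basic Ollama workflow on a Mac looks roughly like this (the model name here is just an example; check Ollama's model library for current options, and note that the API call requires the local Ollama server to be running):

```shell
# Install Ollama on macOS (a standalone installer is also available)
brew install ollama

# Pull a model, then chat with it interactively
ollama pull llama3.1
ollama run llama3.1 "Write a quicksort in Python"

# Ollama also exposes a local HTTP API (default port 11434),
# which is what editor integrations typically talk to
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.1", "prompt": "Hello", "stream": false}'
```

Comparing models on the same task is then just a matter of swapping the model name in `ollama run`.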

You can set it up in your editor of choice - I use Zed, and it's just listed as one of the providers you can choose.
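For reference, pointing Zed at a local Ollama instance is a small settings change. This is a sketch only; the exact keys depend on your Zed version, so check Zed's documentation before copying it:

```json
// ~/.config/zed/settings.json (hypothetical excerpt; verify against Zed docs)
{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434"
    }
  }
}
```

Once configured, Ollama shows up alongside the hosted providers in the assistant's model picker.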

In terms of performance - it works decently well.

bookworm123 7 months ago

Cool idea; however, when I have to create an account for a service just to test it out, I will naturally decline to do so. Maybe you could upload a sample document for people to play with?
