What is a good machine to run LLMs on?

3 points by rabbitofdeath 2 years ago · 4 comments


I would really just like to run Open WebUI and a few models for local chat use. I'm not into training (yet) and am patient. What is a good, cost-effective way to get started?
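For context, the workflow I'm picturing is plain local chat, something like this sketch (assuming I pair Open WebUI with an Ollama backend; the ollama Python client usage here is just illustrative):

    # Minimal local chat loop using the ollama Python client
    # (pip install ollama). Assumes an Ollama server is already
    # running locally and the model has been pulled.
    import ollama

    history = []
    while True:
        user = input("> ")
        history.append({"role": "user", "content": user})
        reply = ollama.chat(model="llama3", messages=history)
        answer = reply["message"]["content"]
        history.append({"role": "assistant", "content": answer})
        print(answer)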

LorenDB 2 years ago

VRAM is king if you want to run larger (and therefore more accurate) models. 12 GB VRAM will let you run 13B models, which are great for local chat, but you could get away with 8 GB VRAM to run an 8B model as well; I'd recommend Llama 3 8B for that.
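A rough way to sanity-check whether a model fits in a given amount of VRAM (the 4-bit quantization and ~25% runtime overhead figures here are ballpark assumptions, not exact numbers):

    # Back-of-the-envelope VRAM estimate for a quantized model:
    # weights take params * bits / 8 bytes, plus overhead for the
    # KV cache, activations, and runtime buffers (~25% assumed).
    def estimated_vram_gb(params_billion: float, bits: int = 4,
                          overhead: float = 1.25) -> float:
        weight_gb = params_billion * bits / 8  # 1e9 params * bits/8 bytes = GB
        return weight_gb * overhead

    for name, size_b in [("Llama 3 8B", 8), ("13B", 13)]:
        print(f"{name}: ~{estimated_vram_gb(size_b):.1f} GB at 4-bit")
    # Llama 3 8B: ~5.0 GB -> fits in 8 GB
    # 13B:        ~8.1 GB -> fits in 12 GB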

talldayo 2 years ago

A cheap Nvidia GPU with lots of VRAM, like the RTX 3060 12 GB model. It's about the fastest you can expect for the least money.

FlyingAvatar 2 years ago

Any M-series Mac with more RAM than the model you want to run; thanks to unified memory, the GPU can use most of system RAM as if it were VRAM.

rabbitofdeath (OP) 2 years ago

Thank you!
