Is the Nvidia RTX 3060 the best value for entry level local AI?

aiflux.substack.com

3 points by ericdotlee 3 months ago · 2 comments


ericdotlee (OP) 3 months ago

Hi, I'm looking to transition from renting GPUs from RunPod to hosting some models locally - specifically qwen-2.5 and some lightweight VLMs like Moondream. The RTX 3060 12GB looks like a relatively good option, but I don't have much experience with PC hardware, let alone used hardware.

Curious if anyone here runs a similar config of 1-4 RTX 3060s? I'm trying to decide whether picking up a few of these is good value, or whether I should just keep renting cloud GPUs.
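One way to sanity-check whether a given model fits on a 12GB card is a back-of-envelope memory estimate: weights at the quantized bit width, plus some headroom for the CUDA context and KV cache. A minimal sketch, assuming a ~7.6B-parameter Qwen2.5-7B and roughly 4.7 effective bits per weight for a typical 4-bit GGUF quant (both numbers are assumptions, not from the thread):

```python
# Back-of-envelope VRAM estimate for a quantized LLM.
# Assumed figures: Qwen2.5-7B ~7.6B params; a 4-bit GGUF quant
# averages ~4.7 bits/weight; ~1.5 GB overhead for CUDA context
# plus a modest KV cache. Adjust to your actual model/quant.

def vram_gb(params_billions, bits_per_weight, overhead_gb=1.5):
    """Rough GPU memory needed: quantized weights + runtime overhead."""
    weights_gb = params_billions * bits_per_weight / 8  # B params -> GB
    return weights_gb + overhead_gb

print(f"Qwen2.5-7B  @ ~4-bit: {vram_gb(7.6, 4.7):.1f} GB")   # well under 12 GB
print(f"Qwen2.5-14B @ ~4-bit: {vram_gb(14.8, 4.7):.1f} GB")  # tight on 12 GB
```

By this rough math a 7B model at 4-bit fits a 3060 comfortably, while 14B is borderline once you grow the context window; larger models are where the multi-card or more-VRAM question starts to matter.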

  • leakycap 3 months ago

    Have you set up hardware/software to run models locally before? Not that it's incredibly hard, but the cloud GPU providers usually simplify a lot of the steps.

    Going from one card running a local model to multiple cards is a big jump, and I'd wait to tackle it until I was very comfortable with one. That might mean you want a single card with more VRAM instead of the 3060.

    Starting with one card doesn't work for everyone, but you're biting off a lot compared to renting a cloud GPU, and it can take a lot of time to get a local setup going for the first time.
