Show HN: LocalClaw – Find the right local LLM for your exact hardware

localclaw.io

1 point by CDieumegard 2 months ago · 3 comments

CDieumegardOP 2 months ago

I built this because I kept seeing the same question everywhere: "what model should I run on my 16GB Mac?" or "will this fit on my 4060?"

LocalClaw matches your exact hardware (OS, RAM, GPU VRAM) to the right model + quantization out of 125+ options. Takes about 30 seconds.

It recommends for LM Studio specifically, shows file sizes, speed/quality scores, and explains WHY a model fits your setup (RAM usage %, VRAM fit, context window overhead).

Everything runs in the browser. No data collected, no account needed.
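For anyone curious what a fit check like this looks like, here is a minimal sketch. The function name, thresholds, and numbers are illustrative assumptions, not LocalClaw's actual logic: it compares the on-disk size of a quantized model plus a context-window (KV cache) overhead against available VRAM and system RAM.

```python
# Hypothetical sketch of a hardware-fit check, in the spirit of the tool
# described above. All names and thresholds are illustrative, not LocalClaw's
# actual implementation.

def fits_hardware(model_gb: float, context_overhead_gb: float,
                  vram_gb: float, ram_gb: float) -> dict:
    """Estimate whether a quantized model fits a machine.

    model_gb: on-disk size of the quantized weights (e.g. a Q4_K_M GGUF)
    context_overhead_gb: extra memory for the KV cache at the chosen context
    """
    total_gb = model_gb + context_overhead_gb
    vram_fit = total_gb <= vram_gb            # fully GPU-resident?
    ram_pct = round(100 * total_gb / ram_gb)  # share of system RAM if offloaded
    return {
        "vram_fit": vram_fit,
        "ram_usage_pct": ram_pct,
        "fits": vram_fit or ram_pct <= 80,    # illustrative 80% safety margin
    }

# e.g. a ~4.7 GB 7B Q4 model on an 8 GB-VRAM GPU with 16 GB RAM
print(fits_hardware(model_gb=4.7, context_overhead_gb=1.0,
                    vram_gb=8.0, ram_gb=16.0))
```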

Built this in Switzerland. Feedback welcome, especially if a recommendation feels off for your hardware.

CDieumegardOP 2 months ago

I'm currently building a full Mac application to install Homebrew, Node, LM Studio, an adapted LLM, and OpenClaw locally.

A WIP v1 is done, but I want a one-click solution.

CDieumegardOP 2 months ago

Major UI update on localclaw.io, plus a dark/light theme added.
