Ask HN: What tiny LLMs are you getting the best results from?

4 points by chrisrodrigue 10 months ago · 1 comment · 1 min read


Curious whether anyone here has had success running smaller LLMs locally on constrained hardware, such as laptops or GPU-less devices. If so, what kind of utility have they brought you?

Uzmanali 10 months ago

I’ve had solid luck with TinyLlama and Phi-2 on my MacBook Air (no GPU). They’re great for quick drafts, note summaries, and basic Q&A. No internet needed, so they’re super handy when traveling.
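For anyone wanting to try this, one common route is Ollama, which wraps llama.cpp and runs quantized models on CPU-only machines. A minimal sketch, assuming Ollama is installed; the model tag and prompt are illustrative, not from the thread:

```shell
# Download a small quantized model (TinyLlama is ~1.1B parameters,
# so it fits comfortably in a few GB of RAM on a GPU-less laptop).
ollama pull tinyllama

# Run a one-off prompt entirely offline once the model is cached locally.
ollama run tinyllama "Summarize these meeting notes in three bullet points: ..."
```

After the initial `pull`, no network connection is needed, which matches the traveling use case described above.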
