PewDiePie using self-hosted AI

youtube.com

4 points by deevus 2 months ago · 2 comments

chasing0entropy 2 months ago

I also use self-hosted LLMs. You can make three GTX 1080s run a 7B model competently at limited context through Ollama. Get a little bolder with LM Studio and you can actually get a coherent and somewhat reliable model.
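For the curious, "limited context through Ollama" roughly means capping the context window when you query the local server. A minimal sketch below, assuming Ollama's default local REST endpoint; the model name, prompt, and context size are illustrative, and the request is only constructed, not sent, so no server is needed to follow along.

```python
import json

# Ollama serves a local REST API on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "mistral:7b",  # any 7B model fetched with `ollama pull` (assumed name)
    "prompt": "Summarize the appeal of self-hosted LLMs in one sentence.",
    "stream": False,
    "options": {
        "num_ctx": 2048,  # capped context window to fit limited VRAM
    },
}

# With a server running, sending it would be: requests.post(OLLAMA_URL, json=payload)
print(json.dumps(payload, indent=2))
```

Keeping `num_ctx` small is the main lever for squeezing a model onto older cards: the KV cache grows with context length, so a tighter window trades long conversations for fitting in VRAM.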

keyle 2 months ago

On macOS, if you opted for 32GB of RAM, you can run a gpt-oss model with LM Studio really easily.

It's "good enough" for a lot of questions and doesn't go up and down like a yo-yo (the OpenAI dashboard lies).
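LM Studio exposes an OpenAI-compatible server on the local machine, which is what makes swapping it in for a hosted API straightforward. A sketch under assumptions: the default port 1234, a hypothetical model identifier, and a request that is only built and printed, not sent.

```python
import json

# LM Studio's local server speaks the OpenAI chat-completions format.
# Port and model identifier below are assumptions for illustration.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

request_body = {
    "model": "gpt-oss-20b",  # whichever model is currently loaded in LM Studio
    "messages": [
        {"role": "user", "content": "Does this run fully offline?"},
    ],
    "temperature": 0.2,
}

# With the server running: requests.post(LMSTUDIO_URL, json=request_body)
print(json.dumps(request_body, indent=2))
```

Because the request shape matches OpenAI's chat API, existing client code usually only needs its base URL pointed at localhost, which is why the "doesn't go up and down like a yo-yo" point matters: availability is whatever your own machine provides.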
