My $600 Mac Mini Runs a 35B AI Model

thoughts.jock.pl

4 points by danebalia 23 days ago · 3 comments

bigyabai 23 days ago

> The 35B Trick (Your SSD Is the New GPU Memory)

Wave "bye bye" to your write cycles.

  • RobMurray 23 days ago

    why? it's mostly reads. the weights are static.

    • bigyabai 23 days ago

      llama.cpp's process is, but macOS itself will swap hard when 10-14 GB of memory is paged for LLM inference. Dense models especially would thrash zram.
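The distinction the thread is drawing can be illustrated with a small sketch: weights mapped read-only are clean, file-backed pages that the OS can evict and re-read from the SSD without ever writing to swap, which is why llama.cpp-style inference is "mostly reads." This is a minimal, hypothetical demonstration using Python's `mmap` module with a stand-in weights file, not llama.cpp's actual loader.

```python
import mmap
import os
import tempfile

# Hypothetical stand-in for a model checkpoint on disk.
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
with open(path, "wb") as f:
    f.write(os.urandom(1 << 20))  # 1 MiB of fake "weights"

# Read-only, file-backed mapping: touching bytes triggers read faults
# from the SSD, and under memory pressure the kernel can simply drop
# these clean pages and re-fault them later -- no swap writes needed.
with open(path, "rb") as f:
    weights = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    header = weights[:16]  # demand-pages only what is touched
    assert len(weights) == 1 << 20
    weights.close()
```

The swap pressure the parent comment describes comes from everything that is *not* file-backed and read-only: KV cache, activations, and the rest of the system's anonymous memory, which must be written out (or compressed) when evicted.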
