Ask HN: You have 50k USD and want to build an inference rig without GPUs. How?

3 points by 4k a year ago · 1 comment · 1 min read


This is more of a thought experiment; I'm hoping to learn about developments in the LLM inference space beyond GPUs.

Conditions:

1. You want a solution for LLM inference and LLM inference only. You don't care about any other general- or special-purpose computing.

2. The solution can use any kind of hardware you want.

3. Your only goal is to maximize (inference speed) × (model size) for 70B+ models.

4. You're allowed to build this with tech most likely available by the end of 2025.

How do you do it?
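One way to frame condition 3: on non-GPU hardware, autoregressive decoding is typically memory-bandwidth bound, since each generated token streams the full weight set once, so tokens/sec is roughly bounded by bandwidth divided by model size in bytes. A minimal sketch of that back-of-envelope estimate (the bandwidth figures below are illustrative assumptions, not benchmarks):

```python
def tokens_per_sec(bandwidth_gb_s: float, params_b: float, bytes_per_param: float) -> float:
    """Rough upper bound on decode speed: each token reads every weight once."""
    model_bytes = params_b * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Hypothetical hardware profiles (GB/s of memory bandwidth) for illustration:
for name, bw in [("dual-socket server CPU", 600), ("unified-memory SoC", 800)]:
    fp16 = tokens_per_sec(bw, 70, 2)  # 70B params at 2 bytes/param (fp16)
    q4 = tokens_per_sec(bw, 70, 0.5)  # ~4-bit quantization
    print(f"{name}: ~{fp16:.1f} tok/s fp16, ~{q4:.1f} tok/s 4-bit (70B)")
```

The sketch ignores compute limits, batching, and KV-cache traffic, but it explains why high-bandwidth memory (and quantization) dominates any GPU-free design for this objective.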

sitkack a year ago

You wait until someone posts an answer here, https://www.reddit.com/r/LLMDevs/comments/1if0q87/you_have_r...

https://www.phind.com/search/cm6lxx6hw00002e6gioj41wa5
