Understanding LLM Compute Requirements

intuitiveai.substack.com

4 points by pgspaintbrush 3 years ago · 2 comments

brucethemoose2 3 years ago

Inference is not that FLOP-intensive for modern iGPUs... the bigger issue is the sheer memory requirement.

Training requires a ridiculous amount of both, but again, the raw compute requirement would be much lower with huge memory pools.
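The point about memory dominating can be sketched with back-of-the-envelope arithmetic (my own illustrative numbers, not from the thread): weight memory is roughly parameter count times bytes per parameter, while decoding a token costs only about 2 FLOPs per parameter.

```python
# Rough inference estimates; the 7B model size is a hypothetical example.

def weight_memory_bytes(n_params: int, bytes_per_param: int = 2) -> int:
    """Memory just to hold the weights (fp16 -> 2 bytes/param)."""
    return n_params * bytes_per_param

def flops_per_token(n_params: int) -> int:
    """Approximate forward-pass cost: ~2 FLOPs per parameter per token."""
    return 2 * n_params

n = 7_000_000_000  # hypothetical 7B-parameter model
print(weight_memory_bytes(n) / 1e9)   # ~14 GB of weights in fp16
print(flops_per_token(n) / 1e12)      # ~0.014 TFLOPs per generated token
```

At tens of tokens per second that compute load is well within a modern iGPU's reach, but the 14 GB of weights (before any KV cache) is what actually rules most hardware out.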

pgspaintbrushOP 3 years ago

A guide for the non-technical
