Understanding LLM Compute Requirements
A guide for the non-technical
Inference is not that FLOP-intensive for modern integrated GPUs (iGPUs)... the bigger issue is the sheer memory requirement.
Training requires a ridiculous amount of both, but again, the raw compute requirements would be much lower with huge memory pools available.
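To put rough numbers on that, here's a quick back-of-envelope sketch. It assumes a hypothetical 70B-parameter model stored in 16-bit precision and uses the common rules of thumb of roughly 2 FLOPs per parameter per generated token for inference and roughly 6 for training; the exact figures vary by model and setup.

```python
# Back-of-envelope estimate: memory vs. compute for a hypothetical 70B model.
# Rules of thumb: inference ~2 FLOPs per parameter per token,
# training ~6 FLOPs per parameter per token (forward + backward pass).

params = 70e9          # assumed parameter count
bytes_per_param = 2    # fp16 / bf16 weights

weight_memory_gb = params * bytes_per_param / 1e9
inference_flops_per_token = 2 * params
training_flops_per_token = 6 * params

print(f"Weights alone: ~{weight_memory_gb:.0f} GB of memory")
print(f"Inference:     ~{inference_flops_per_token / 1e12:.2f} TFLOPs per token")
print(f"Training:      ~{training_flops_per_token / 1e12:.2f} TFLOPs per token")
```

On those rough numbers, generating a single token costs on the order of 0.1–0.5 TFLOPs, a tiny slice of what a modern accelerator can do in a second, but just holding the weights takes well over 100 GB. That is why memory capacity, not raw FLOPs, tends to be the binding constraint.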