End of Transformer Era Approaches

manifestai.com

15 points by obiefernandez a month ago · 1 comment

alyxya a month ago

> The training budget for this model was $4,000, trained 60 hours on a cluster of 32 H100s. (For comparison, training an LLM of this scale from scratch typically costs ~$200k.)

What they did is closer to fine-tuning, so this comparison isn't helpful. The article is at least transparent about this, but citing the cost and performance seems disingenuous when they're mostly piggybacking off an existing model. Until they train an equivalently sized model from scratch and demonstrate a notable benefit, this looks like, at best, a sidegrade to transformers.
