GPT-2 implementation in Modular MAX

github.com

2 points by red2awn 3 months ago · 1 comment

red2awn (OP) · 3 months ago

I am learning to write LLM pipelines using the Modular MAX inference framework. As a starting point I got GPT-2 working after reading through "The Illustrated GPT-2", Karpathy's nanoGPT codebase, and the existing models in the Modular repo. The MAX framework does require a lot of boilerplate and isn't designed to be very flexible, but you do get awesome performance out of the box.
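For context on what actually gets ported: the core of GPT-2 is just a stack of pre-norm transformer blocks (LayerNorm → causal self-attention → residual, then LayerNorm → 4x-expansion GELU MLP → residual). Here is a minimal PyTorch-style sketch of one such block, roughly following nanoGPT's structure; the names and defaults are illustrative and are not the API of the MAX port.

```python
# Hypothetical sketch of a GPT-2 pre-norm transformer block (nanoGPT-style),
# not the actual code from the MAX implementation.
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, n_embd: int = 768, n_head: int = 12):
        super().__init__()
        self.ln_1 = nn.LayerNorm(n_embd)
        self.attn = nn.MultiheadAttention(n_embd, n_head, batch_first=True)
        self.ln_2 = nn.LayerNorm(n_embd)
        # GPT-2 MLP: 4x hidden expansion with GELU
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.GELU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: True entries are positions a token may NOT attend to.
        T = x.size(1)
        mask = torch.triu(
            torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1
        )
        a = self.ln_1(x)
        a, _ = self.attn(a, a, a, attn_mask=mask, need_weights=False)
        x = x + a                       # residual around attention
        x = x + self.mlp(self.ln_2(x))  # residual around MLP
        return x

if __name__ == "__main__":
    # Quick shape check: batch of 2, 16 tokens, 768-dim embeddings.
    y = Block()(torch.randn(2, 16, 768))
    print(y.shape)  # torch.Size([2, 16, 768])
```

The full GPT-2 small model is 12 of these blocks (n_embd=768, n_head=12) plus token and position embeddings and a final LayerNorm; porting it to MAX mostly means expressing the same graph with the framework's own ops and session setup.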
