MLC LLM: 70B Llama-2-4bit on MacBook at 50%-80% speed of A100 (twitter.com)
Memory is all you need, and MLC!
The 7B version is available in the App Store for iPhones and iPads: https://t.co/NLqe9mypgB
And I can confirm it also runs on my M1 Pro MBP.
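For anyone who wants to try this locally, here's a minimal sketch using MLC LLM's Python chat API as it existed around the Llama 2 release (the mlc_chat package with its ChatModule class); the exact package name, model ID, and method names may differ in your installed version, so treat the identifiers below as assumptions and check the MLC LLM docs.

```python
# Minimal sketch: running a 4-bit quantized Llama-2 chat model on an
# Apple Silicon Mac via MLC LLM's Python API. The model ID below is an
# assumption based on the 2023-era prebuilt naming scheme.
from mlc_chat import ChatModule

# "q4f16_1" denotes 4-bit weight quantization with fp16 compute, which is
# what lets the larger models fit in a MacBook's unified memory.
cm = ChatModule(model="Llama-2-70b-chat-hf-q4f16_1")

# Run a single generation on the Metal GPU backend.
output = cm.generate(prompt="Explain why 4-bit quantization shrinks memory use.")
print(output)

# Print runtime statistics such as prefill and decode tokens/sec.
print(cm.stats())
```

The "Memory is all you need" point is the real constraint: at 4 bits per weight the 70B model's weights alone are on the order of 35 GB, so the 7B/13B variants run comfortably on an M1/M2 Pro while the 70B one needs a high-memory configuration.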
At least Meta is giving you the entire model to play with for free right now, at $0, unlike O̶p̶e̶n̶AI.com, which is running to the government to stop all of this.
In this race to zero, Meta is at the finish line and is building an ecosystem for everyone (except those with 700M+ DAUs) to use its Llama LLM and commercialize it. No API-only access or gated models at all.
This will only accelerate. Llama 2 has already caught up to roughly the level of GPT-3.5, and with MLC LLM it can run on a MacBook.
Meta has a moat with a free, $0 LLM that is as good as GPT-3.5. O̶p̶e̶n̶AI.com cannot raise its prices, and its so-called 'moat' is rapidly being eroded.
Yeah, I believe countless new research and product ideas will be built on top of the open-source Llama 2.