MiniMax M2.1
minimaxi.com
The HuggingFace link is published but not working yet: https://huggingface.co/MiniMaxAI/MiniMax-M2.1
Looks like this is 10 billion activated parameters / 230 billion in total.
So this is the biggest open model that can be run on your own host / own hardware at somewhat decent speed. I'm getting 16 t/s on my Intel Xeon W5-3425 / DDR5-4800 / RTX 4090D 48GB.
And looking at the benchmark scores, it's not that far from SOTA (matches or exceeds the performance of Claude Sonnet 4.5).
Only 10b active params and close to SOTA?
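That 16 t/s figure is roughly what memory bandwidth predicts for a MoE with 10B active parameters. A back-of-the-envelope sketch, assuming ~4-bit quantization (0.5 bytes/param, an assumption not stated in the post): each token requires streaming the active weights once, so the implied effective bandwidth is modest.

```python
# Rough estimate of the memory bandwidth implied by 16 t/s decode speed
# on a MoE model with 10B active parameters.
# Assumption (not from the post): weights quantized to ~4 bits = 0.5 bytes/param.
active_params = 10e9
bytes_per_param = 0.5   # assumed 4-bit quantization
tokens_per_sec = 16

bytes_per_token = active_params * bytes_per_param          # ~5 GB streamed per token
effective_bw_gbs = bytes_per_token * tokens_per_sec / 1e9  # GB/s of weight streaming
print(f"~{effective_bw_gbs:.0f} GB/s")                     # ~80 GB/s
```

~80 GB/s is well within reach of an 8-channel DDR5-4800 platform plus a GPU holding the hot layers, which is why only the *active* parameter count, not the 230B total, sets the decode speed.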
Right now it returns "Internal Server Error".
Here's the saved page: http://archive.today/nDUc4
"Significantly Enhanced Multi-Language Programming, Built for Real-World Complex Tasks"