Silicon Valley is quietly running on Chinese open source models

old.reddit.com

5 points by nreece 23 days ago · 5 comments

operatingthetan 23 days ago

I'm using minimax m2.7 and it's good enough. What I'd like to understand, though, is how these models are so cheap. Surely the compute costs them just as much? Do US-based AI companies have that much overhead?

  • yorwba 23 days ago

    There are US-based companies offering inference for MiniMax models charging slightly less than what MiniMax charges. MiniMax themselves claim to be using data centers in the US. US companies training their own closed-weight models charge so much more because they can. They're monopoly providers for their own models, so they can ask for whatever amount people are willing to pay.
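The pricing argument above can be sketched numerically. This is a hypothetical illustration, not real rate data: the prices and the `cost_usd` helper are made up to show how competitive hosting of the same open weights pushes prices toward compute cost, while a monopoly provider can price at willingness-to-pay.

```python
# Hypothetical illustration of the comment's pricing argument.
# All prices are placeholders, not actual MiniMax or US-provider rates.

def cost_usd(tokens: int, price_per_million: float) -> float:
    """Cost of processing `tokens` at a given $/1M-token price."""
    return tokens / 1_000_000 * price_per_million

# Several hosts serve the same open-weight model, so competition pushes
# the price toward compute cost plus a thin margin; the sole provider of
# a closed-weight model faces no such pressure.
open_weight_price = 0.30    # $/1M tokens (placeholder)
closed_weight_price = 3.00  # $/1M tokens (placeholder)

tokens = 10_000_000
print(cost_usd(tokens, open_weight_price))    # 3.0
print(cost_usd(tokens, closed_weight_price))  # 30.0
```

With comparable compute costs per token, the entire difference in the sketch is margin, which is the commenter's point.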

gostsamo 23 days ago

tbh, the observation that models in the pipeline get cheaper when a comparable local one exists is comparable only to "warm water is nice and relaxing." The Cursor case is a bit different, but that is because Cursor cannot be profitable while competing with its own providers, and it is not yet clear whether it will survive at all or whether the Kimi model will prove to be good competition.
