The AI Industry's Business Model Is Cracking – DeepSeek Just Proved It

marvin-labs.com

8 points by alexdoesstuff 10 months ago · 3 comments

jqpabc123 10 months ago

Nothing has really been proven just yet.

I say this because all available LLMs are currently running largely on "funny money", subsidized either by venture capital or by governments.

We won't know the real costs until they are forced to survive in the marketplace on their own merits. And based on nothing but their energy and hardware needs, they won't be exactly "cheap"; they will follow a "computing as a service" model prone to bait-and-switch tactics.

Basically, LLMs turn traditional computing upside down. Instead of reliable results at low cost, LLMs offer unreliable results at high cost.

And because of this, I expect the real-world use cases to be far narrower than many seem to expect. The two prominent early examples are search engines (where accuracy is not essential) and research involving trial and error (where accuracy will be verified).

  • alexdoesstuff (OP) 10 months ago

    Author here

    I mostly agree on the first point. Even prior to the price race to the bottom, no AI Lab managed to make any money above marginal cost on inference, let alone recoup its investment in infrastructure or model training. Clearly, investment in infrastructure and model training has been largely subsidized by VCs. It's less clear how much inference itself has been subsidized. The fact that AWS runs hosted inference at roughly the same cost as the AI Labs suggests to me that there's at least not a massive subsidy going on at the moment.

    I don't subscribe to the narrative that nation states (i.e. China) massively support DeepSeek. Thus, while their core business as a hedge fund is clearly profitable, they have considerably shallower pockets and less willingness to front losses than the investors behind VC-backed AI Labs. Consequently, I expect their inference pricing to at least cover their marginal costs (i.e. energy) and maybe some of the infrastructure investment.

    All that suggests they've managed to lower the cost of inference considerably (and with it, presumably, the resource and energy requirements), which to me is a clear game changer.
