AI API Prices are 90% Subsidized

tinyml.substack.com

27 points by csoham 7 months ago · 13 comments

python273 7 months ago

A much better article on token prices: https://www.tensoreconomics.com/p/llm-inference-economics-fr...

There's not much incentive to subsidize prices for OpenRouter providers for example, and the prices are much lower than the $6.37/M estimate from the article.

https://openrouter.ai/meta-llama/llama-3.3-70b-instruct

avg $0.37/M input tokens, $0.73/M output tokens (21 providers)

Llama isn't even a good example, since more recent models are better optimized, using Mixture of Experts and KV-cache compression.
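For a sense of the gap, here is a back-of-the-envelope comparison of the article's $6.37/M estimate against the quoted OpenRouter averages. The input/output token splits below are illustrative assumptions, not measured workloads:

```python
# Back-of-the-envelope comparison of the article's $/M-token estimate with the
# quoted OpenRouter averages for Llama 3.3 70B. The input/output splits are
# illustrative assumptions, not measured workloads.

ARTICLE_ESTIMATE = 6.37   # $/M tokens, from the article
INPUT_PRICE = 0.37        # $/M input tokens (OpenRouter average)
OUTPUT_PRICE = 0.73       # $/M output tokens (OpenRouter average)

def blended_price(input_share: float) -> float:
    """Blended $/M tokens for a workload with the given share of input tokens."""
    return input_share * INPUT_PRICE + (1 - input_share) * OUTPUT_PRICE

for share in (0.5, 0.75, 0.9):
    blended = blended_price(share)
    print(f"{share:.0%} input tokens: ${blended:.2f}/M, "
          f"about {ARTICLE_ESTIMATE / blended:.0f}x below the article's estimate")
```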

apsec112 7 months ago

This ignores batching - token generation is much more efficient in batches - and I strongly suspect it is itself written by AI, given the heavy use of bullets.
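A toy sketch of that batching effect: the hourly GPU price, single-stream decode speed, and efficiency factor below are illustrative assumptions, not measurements, but they show how per-token cost falls as more requests share the same hardware:

```python
# Toy model of how batching amortizes GPU cost across concurrent requests.
# GPU rental price, decode speed, and the efficiency factor are illustrative
# assumptions; real throughput flattens out as the batch grows.

GPU_COST_PER_HOUR = 2.0      # $/hour, hypothetical rental price
SINGLE_STREAM_TPS = 30       # output tokens/sec when serving one request

def cost_per_million_output_tokens(batch_size: int, efficiency: float = 0.8) -> float:
    """$/M output tokens when batch_size requests share one GPU.

    `efficiency` crudely models per-request throughput loss as the batch
    grows (memory bandwidth, KV-cache pressure).
    """
    aggregate_tps = SINGLE_STREAM_TPS * batch_size * efficiency
    tokens_per_hour = aggregate_tps * 3600
    return GPU_COST_PER_HOUR / tokens_per_hour * 1_000_000

for batch in (1, 8, 32, 128):
    print(f"batch={batch:>3}: ~${cost_per_million_output_tokens(batch):.2f}/M output tokens")
```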

PaulHoule 7 months ago

When the AI hype train left the station I said "we don't understand how these things work at all and they're going to get much cheaper to run" and that turned out to be... true.

Already vendors of legacy models like ChatGPT-4 have to subsidize inference to keep up with new entrants built on a better foundation. It's likely that inference costs can be brought down by another factor of ten or so, so of course you have to subsidize these by 90% to get to where the industry will be in 2-3 years.

GaggiX 7 months ago

This calculation doesn't account for batching; it makes no sense.

impure 7 months ago

I’ve been playing around with Gemma E4B and have gotten really good results. That’s a model you can run on a phone. So although prices have been going up recently, I suspect they will start to fall again soon.

daft_pink 7 months ago

Also, it ignores the fact that they will keep optimizing and making it more efficient, Moore's-law style, so everyone is basically assuming the price will come down over time.

mrtksn 7 months ago

"Subsidized" is probably not the right word here; it's more of a loss leader in the land-grab race.

It's like the early days of the internet, when everything was amazing and all the people who put money into it were "losing" their money.

It's going to be like this until monopolization sets in and the moat becomes defensible, and then they will enshittify the crap out of it and make their money back 10x, 100x, etc.

revskill 7 months ago

No lol. The quality is mostly bad. Basically you need to prompt in detail, like writing a novel, for the LLM to understand. At that price, we want real AI that actually has common sense, not just an autocompletion tool.

Stop advertising LLMs as AI; instead, sell them as a superior copy-and-paste engine.

What's worst about LLMs is that the more you talk with them, the worse they become, to the point of being broken.
