What Do Google's AI Answers Cost the Environment?
scientificamerican.com

This seems to be missing the most obvious optimization that Google is clearly doing. When you see an AI answer to your question on Google, I guarantee Google hasn't gone off and put your query into an AI on the spot. What it has almost certainly done is stick your query into the same processing engine that all the other queries go to, found a flag that says "Oh, I have a pre-computed AI answer for this," and returned that. It's probably the same order of magnitude as serving that little Wikipedia summary it shows for prominent people. Google isn't in the business of hand-crafting answers to your queries, so the one-off cost of putting that query through the AI is amortized over the billions of answers it serves.
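A minimal sketch of that flow, with everything (names, cache contents, normalization rules) hypothetical; this is a guess at the shape of such a system, not Google's actual architecture:

```python
# Hypothetical sketch: serve a precomputed AI answer when a flag exists,
# falling back to ordinary results. The cache key is a normalized query.
precomputed_answers = {
    "capital of france": "Paris is the capital of France.",
}

def normalize(query: str) -> str:
    # Collapse case and whitespace so near-identical queries share one entry.
    return " ".join(query.lower().split())

def serve(query: str) -> str:
    cached = precomputed_answers.get(normalize(query))
    if cached is not None:
        return cached  # cheap: one key lookup, no model run
    return "(regular search results)"  # no precomputed answer for this query

print(serve("Capital of   France"))
```

The expensive model call happens once, offline, when the answer is precomputed; the per-query cost is a dictionary lookup.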
Caching definitely helps, I agree there's no way Google would be wasting so much re-running every search token through Gemini.
That said, it's still worth calling out that their Gemini answers use drastically more resources whenever they are generated and cached. We'd need to know Google's caching rules and the average cache-hit frequency to know how much caching actually reduces Gemini's resource usage.
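A back-of-envelope model makes the dependence on hit rate concrete. All numbers here are invented for illustration; the real per-answer figures are unknown:

```python
# Toy model: effective energy per query given a cache hit rate.
# A miss pays for fresh inference; every query pays a small serving cost.
def effective_energy_wh(inference_wh, serve_wh, hit_rate):
    return (1 - hit_rate) * inference_wh + serve_wh

# Assume 3 Wh for a fresh Gemini answer, 0.03 Wh to serve a cached one.
for hit_rate in (0.0, 0.9, 0.99):
    print(hit_rate, effective_energy_wh(3.0, 0.03, hit_rate))
```

Under these made-up numbers, a 99% hit rate cuts the per-query energy from ~3 Wh to ~0.06 Wh, which is why the hit rate is the key unknown.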
Yes, but all this compute is great for Google's Cloud Business revenues!
Smort. Monopolies gonna monopoly.
It’s a good thought, but I asked a pretty specific question (where can you get in to swim in the Napa river) the other day, and it had a generally correct answer, but then refined it to “within 5 miles of yountville” and it came back with an even more specific answer. I don’t think this was precomputed, though I could be wrong.
> She and her colleagues calculated that the large language model BLOOM emitted greenhouse gases equivalent to 19 kilograms of CO2 per day of use, or the amount generated by driving 49 miles in an average gas-powered car.
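The article's conversion itself checks out arithmetically, whatever the figure measures: 19 kg over 49 miles implies roughly 0.39 kg CO2 per mile, close to the EPA's commonly cited ~0.4 kg/mile for an average gasoline car:

```python
# Sanity-check the quoted equivalence: 19 kg CO2 vs. 49 miles of driving.
kg_co2 = 19
miles = 49
implied_kg_per_mile = kg_co2 / miles
print(round(implied_kg_per_mile, 2))  # ~0.39 kg CO2 per mile
```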
If you go on to read the paper, you discover this is an outright lie. These figures are for training, not usage.
So Google probably produces hundreds of thousands of times more greenhouse gases from having employees commute into the office than it used for training
The return to office was necessary for Google to have a coherent product strategy \s
Uhm, well, even if it were for usage, that's a pretty small amount of emissions. For training, vanishingly small
This is such a strange line of thought. What would it cost a human to go through the same training data and come up with a similar summary? Across all the fields and languages that Google has access to?
Even compared to regular indexing, it's not exactly apples to apples. You still have to go through a dozen or more links and manually read them and filter out spam before getting a good idea.
It'd be fairer to compare the LLM summary to some other scraping-and-summarization system.
I agree. Also think about just how much paper one would have to print, or buy in book form, to get the same data.
That said, Jevons paradox seems to be a hardcoded feature of the physical underpinnings of modern society, so we're probably headed for some sort of "busybody" machine world with AIs everywhere that are highly efficient but also highly wasteful.
Very few modern processes so far have led to more efficient or scarcer use of the earth's resources, which is well mirrored in the ever-increasing layers of abstraction in computing, where a supercomputer at home can struggle to render a basic desktop interface without latency these days.
> Very few modern processes so far have led to more efficient or scarcer use of the earth's resources
I think that's not true. The amount of material needed to make a computer, and the power to drive it, has gone down incredibly since valve-based, room-sized computers, alongside a simultaneous massive increase in computing power.
And there are many more of them.
> the ever-increasing layers of abstraction in computing, where a supercomputer at home can struggle to render a basic desktop interface without latency these days
This is such an embarrassment. It is lunacy that a modern desktop OS feels more sluggish than a late 90s box.
I wonder how much of the perceived lag in modern OSes would be mitigated by a buzzing, clunking hard drive or floppy, though. You could hear the computer working.
> Very few modern processes so far have led to more efficient or scarcer use of the earth's resources
Scarcer? Probably right. More efficient? That seems very, very wrong. We're constantly looking for more efficient ways of doing and building things - if you'd like to frame this in a cynical way, look at it like this: businesses/industries are always looking to reduce costs. Sometimes this is done by firing thousands of people, sometimes this is by using more efficient materials/processes.
> Very few modern processes so far have led to more efficient or scarcer use of the earth's resources
Could you list some of these processes you are referring to?
There will be a massive drive to do LLM inference cheaper at lower power. Also there is a huge gold rush in AI now. Who knows what will survive? Who will need GPUs?
Finally, there are advantages of scale to centralizing resources, which will create a high degree of incentive and specialization to drive down cost and power consumption at these data centers.
It’s just gonna be insane until the near term uncertainty is ironed out.
Rather like the energy consumed by spam/antispam or the energy cost of unwanted advertising, the energy cost of an unwanted LLM response at the top of search queries is just another thing consumers are going to have to put up with being made to feel guilty about.
Why do users need to feel guilty? They should know the cost, and from there they can either accept it and keep using Google or go somewhere else.
There's always the end case where no search tools offer LLM-free results, but even then a person can still just not use search if they really care.
It's just about informed consent in my book.
Why would spam energy make consumers feel guilty?
I've seen discussion about analog circuits which might do a lot of the work for much less power. I hope people are seriously investigating it. I think maybe analog computers were abandoned too quickly.
The issues with analog computers are error accumulation, non-determinism due to environmental factors, and calibration requirements.
However you can have an architecture that deals with the error well enough to make useful estimates. Artificial and biological neural networks for instance.
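A toy illustration of that point, with arbitrary weights and inputs: a thresholded "neuron" whose multiply-accumulates are each perturbed (loosely standing in for analog drift) can still produce the same decision, because the nonlinearity absorbs small errors:

```python
import random

# Toy example: a thresholded neuron tolerating small analog-style noise.
def neuron(inputs, weights, noise=0.0, rng=None):
    rng = rng or random.Random(0)  # seeded so the sketch is reproducible
    # Each product gets an independent perturbation in [-noise, noise],
    # crudely modeling per-component analog error.
    total = sum(w * x + rng.uniform(-noise, noise)
                for w, x in zip(weights, inputs))
    return 1 if total > 0 else 0

x = [1.0, -0.5, 2.0]
w = [0.8, 0.3, 0.4]
clean = neuron(x, w)              # exact computation
noisy = neuron(x, w, noise=0.05)  # every term perturbed
print(clean, noisy)               # same decision despite the noise
```

The clean sum here is 1.45, so even the worst-case total perturbation (±0.15) can't flip the output. Real analog robustness arguments are more subtle, but this is the basic mechanism.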
Some neuromorphic chips use analog components for storage and processing. Some people are trying to go the other way around and grow biological neurons in a chip (indeed, very low power, though maybe more effort to feed it nutrients).
Yes, but isn't non-determinism a feature of AI?
This is just another reason to tax externalities and be done with it.
Guilt tripping on totally arbitrary things just isn't going to work.
We already have a mechanism for managing scarce resources, it's called price, so let's just use it.
How does a tax fix it? Sure the government takes in the money and could even do right about how they allocate spending it, but the damage (so to speak) is caused before the tax is ever collected.
Consumers just need to know how it actually works, at least at the most basic level of what the costs are. From there either consumers don't care or stop using it. Do we really need the government forcing more morals on us through another sin tax?
My point is that if oil extraction, refining, burning etc is taxed then no-one has to worry about any of this.
Worrying about whether bog-standard everyday activities like turning on a computer and running a program are "bad" is just silly, and even if it did work it would cause mass anxiety.
It's not a sin tax, it's a tax on causing externalities. There is no need for an AI tax because AI is not in and of itself a thing which causes environmental damage.
Well, unless it goes Skynet, but I'm considering that out of scope.
I still don't quite follow you here, how does a tax make it something that no one had to worry about? We tax fuel, for example, but people are still worried about emissions and environmental impacts.
I call it a sin tax in this case because it would be a government taxing something they don't want people doing, or want us doing less of. Where income tax isn't meant to make people want to reduce their income, taxes on alcohol, cigarettes, or in this case oil extraction or burning are specifically implemented to move the market.
The price at the point of use is zero, so how much should we tax it?
At source. Taxes on the sale of LNG, taxes on the production of GHG by power plants, fuel duty, potentially on activities like mining, etc.
Setting the rate is more complex, but anything at all would be a start.
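The pass-through logic of a source-level tax is simple arithmetic. All figures below are hypothetical, purely to show how a carbon price at the source shows up in a downstream electricity bill without anyone auditing individual AI queries:

```python
# Illustration: a source-level carbon price passing through to an
# electricity bill. All numbers are assumed, not real rates.
tax_per_tonne_co2 = 50.0   # $/tonne CO2, an assumed tax rate
grid_intensity = 0.4       # kg CO2 per kWh, an assumed grid mix
energy_kwh = 1_000_000     # assumed monthly data-center consumption

levy = energy_kwh * grid_intensity * tax_per_tonne_co2 / 1000  # kg -> tonnes
print(levy)  # dollars added to the bill, priced in automatically
```

The point of the mechanism: the externality is priced where the emissions happen, and every downstream use (AI or otherwise) inherits the cost proportionally.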
It would be a start (impossible not to be), but you might suffer adverse consequences. Either you make other things expensive that you didn't mean to, through the wrong granularity of rule, or you fail to catch people who slightly change their position so the rule no longer applies to them. And in either case, lots of energy gets spent defining rules and having each company analyze whether it's affected and then what to do about it.
Sure.
It is still substantially more effective and easier than trying to look at literally every individual thing we do and guilt people into doing different things.
It's just not going to work, people will use AI anyway.
I don't see why taxation is a good solution to guilt. Information is. Is Google using anything but renewables powering the AI stuff? Is this even in the top million things people do that generates pollution in some way? Until you know, why would you feel anything?