What's Your AI Footprint?


AI costs energy, lots of it. But how much does your usage cost?
I wanted to find out, so I built Claude Carbon.

Ever since hearing about AI’s staggering energy consumption, I’ve had a nagging guilt about my own usage. I mostly ignored that feeling, until I recently no longer could, for two reasons:

  1. I read Karen Hao’s Empire of AI, essentially a history of the current AI revolution told through the story of Sam Altman and OpenAI. If even half of what’s in the book is true, it’s still absolutely shocking.1

  2. I’ve been using Claude Code for 90% of my AI tasks, and you see the tokens racing upwards next to each prompt, quickly numbering in the thousands or even tens of thousands.

This got me wondering: what does my AI-related energy footprint look like, and how bad is it really?

At the macro level, bad without question. Data centers consumed about 4% of US electricity in 2023, and are projected to consume up to 12% of US electricity by 2028, with AI driving most of that growth. OpenAI’s planned Stargate facility alone would require 2.2 gigawatts of power, more than most cities draw.

But what about individual users like myself?

That’s what spawned Claude Carbon. It monitors my token usage in Claude Code and translates that into energy equivalents, like phone or laptop charges.

Based on my research and a month of tracking,2 my usage averages 12.7 million tokens per day. That translates to 100-200 phone charges daily, or about 15% of a typical US household’s energy consumption. (And this only measures Claude Code. My actual footprint is higher through other AI tools I use.)
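The conversion itself is simple arithmetic. Here's a rough sketch of it in Python; the constants are my own illustrative assumptions (energy per 1,000 tokens, phone battery size), not Claude Carbon's published methodology, so the outputs are ballpark only:

```python
# Back-of-envelope conversion from daily tokens to everyday energy units.
# All constants are ASSUMPTIONS for illustration; see the Claude Carbon
# repo for the actual calculations and sources.

WH_PER_1K_TOKENS = 0.2   # assumed inference energy, Wh per 1,000 tokens
PHONE_CHARGE_WH = 15.0   # assumed energy for one full smartphone charge

def daily_energy_kwh(tokens_per_day: float) -> float:
    """Estimated energy for one day's token usage, in kWh."""
    return tokens_per_day / 1_000 * WH_PER_1K_TOKENS / 1_000

tokens = 12_700_000
kwh = daily_energy_kwh(tokens)
print(f"{kwh:.1f} kWh/day")
print(f"~{kwh * 1_000 / PHONE_CHARGE_WH:.0f} phone charges")
```

With these assumed constants, 12.7 million tokens lands at roughly 2.5 kWh, or around 170 phone charges, inside the 100-200 range above. Swap in different per-token figures and the estimate moves accordingly, which is exactly why the methodology deserves scrutiny.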

I’m not certain the methodology is airtight. But if these numbers are even close to accurate, my energy usage is much higher than I expected.

Even though most casual ChatGPT users consume a fraction of this, there is an ever-growing number of power users like me. If a million such users consume energy at similar rates to mine, that’s 1.6 TWh annually, enough to power 150,0003 homes.4
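That scaling claim is easy to check. A 15% share of a typical US household (~10,600 kWh/year, so ~29 kWh/day) implies roughly 4.4 kWh per heavy user per day, an assumption I make explicit here:

```python
# Scaling one heavy user's estimated footprint to a million such users.
# USER_KWH_PER_DAY is an ASSUMPTION implied by the ~15%-of-a-household
# figure above (15% of ~29 kWh/day).

USER_KWH_PER_DAY = 4.4
HOME_KWH_PER_YEAR = 10_600  # ~average US household annual consumption
users = 1_000_000

annual_kwh = USER_KWH_PER_DAY * 365 * users
print(f"{annual_kwh / 1e9:.1f} TWh/yr")                    # total for 1M users
print(f"~{annual_kwh / HOME_KWH_PER_YEAR:,.0f} homes")     # household equivalent
```

That works out to about 1.6 TWh per year, or roughly 150,000 homes, matching the figures above.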

Claude Carbon makes this visible by translating token usage into energy equivalents. I can’t unsee my numbers now, and they’re making me uncomfortable.

I don’t want to stop using AI, but I also can’t ignore this footprint.

At least as a first step, Claude Carbon is making me ask better questions:

  • Is this task worth it?

  • Could a lighter model handle this?

  • Do I really need to generate this response?

And perhaps the app can go from informational to actual behavioral nudges, like challenges to beat your lowest daily usage and insights that suggest a lighter model for certain tasks.

Say those million power users shifted 30% of their tasks to a more efficient model like Haiku instead of Opus. That would save roughly 340 GWh per year, the equivalent of a small city of 32,000 homes.
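The savings math, with the per-token efficiency ratio spelled out as an assumption (I have not seen Anthropic publish per-model energy figures):

```python
# Rough savings if 30% of heavy-user load moved to a lighter model.
# HAIKU_VS_OPUS is an ASSUMED per-token energy ratio, not a published figure.

BASELINE_GWH = 1_600      # ~1.6 TWh/yr for a million heavy users
SHIFT_FRACTION = 0.30     # share of tasks moved to the lighter model
HAIKU_VS_OPUS = 0.29      # assumed: Haiku at ~29% of Opus energy per token
HOME_KWH_PER_YEAR = 10_600

saved_gwh = BASELINE_GWH * SHIFT_FRACTION * (1 - HAIKU_VS_OPUS)
print(f"~{saved_gwh:.0f} GWh/yr saved")
print(f"~{saved_gwh * 1e6 / HOME_KWH_PER_YEAR:,.0f} homes")
```

Under these assumptions the shift saves about 340 GWh annually, enough for roughly 32,000 homes.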

There’s an obvious irony here: I use AI to build an app about AI energy consumption. But the alternative is ignorance, which seems worse.

Claude Carbon is currently macOS-only and written in Swift (you need Xcode to run it5), but it’s 100% open source under an MIT license and available on GitHub. The GitHub repo includes the underlying calculations and research, so you can scrutinize the numbers or help improve the app.
