Solar and batteries for generic use cases


A brief note on using solar and batteries as generic power sources.

Over the last few years of work at Terraform Industries, we’ve developed several useful heuristics to understand how rapid progress in solar and battery costs will change industry.

This includes the bifurcation of industry between low-capex, high-energy material processing (eg synthetic fuels) and high-capex, high-utilization end uses (eg data centers).

This is important because energy abundance is key to growing prosperity.

(Dotted lines by Jesse Peltan.)

Solar panels are a revolutionary technology: manufacturing that creates energy as an output, not merely consumes it as an input. Inert glass rectangles that, placed on the ground, print out wealth at roughly 100x the rate of the best farmland. We should deploy as many as we can!

To catch you up, here are some charts of recent progress.

Since 2009, solar has been on a tear, getting 44% cheaper per doubling of cumulative deployment and coming down in cost roughly 15% per year. Last year, we deployed 446 GW worldwide (equivalent to the global nuclear fleet, and about 1 MW per minute), and this year China alone deployed 160 GW in just the last 9 months. All indications are that this trend continues – indeed, since I made this chart earlier this year, the cheapest panels have fallen to just 7c/W, considerably below trend.

Meanwhile, the complementary technology of batteries has also continued its crushing progress, with costs falling 23% per doubling of cumulative deployment and recently hitting lows of just $45/kWh, a previously unthinkable price.
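For intuition, here is a minimal sketch of the learning-curve arithmetic implied by those figures (Wright's law). The learning rates are the ones quoted above; the starting costs and cumulative deployment numbers in the examples are placeholders for illustration, not data.

```python
import math

def wrights_law(cost_now, cumulative_now, cumulative_future, learning_rate):
    """Project unit cost after cumulative deployment grows from cumulative_now
    to cumulative_future, with a fixed fractional cost drop per doubling."""
    doublings = math.log2(cumulative_future / cumulative_now)
    return cost_now * (1 - learning_rate) ** doublings

# e.g. solar modules after two more doublings of cumulative capacity (placeholder volumes)
print(wrights_law(0.09, 2_000, 8_000, learning_rate=0.44))   # ~$0.028/W
# e.g. battery packs after two more doublings (placeholder volumes)
print(wrights_law(45, 2_000, 8_000, learning_rate=0.23))     # ~$27/kWh
```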

I became interested in the general trade-off between solar overbuild, battery allocation, utilization, and capex, and set out to solve the problem.

First, I downloaded some publicly available data on real-world solar performance. Here, I used a site in Wyoming which is relatively cloud-free, albeit cold and at a higher latitude than the sunniest locations. It is a good proxy for typical solar abundance in the places where the wealthiest humans live (the US and Europe) and where we should expect the localizing trend of energy generation and consumption to gain speed.

Next, I built a basic model to assess the utilization of a load of a given size coupled to a solar array and a variably sized battery, collecting data over the entire year in 5-minute increments. Then I crunched the data to obtain useful insights.
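To make the setup concrete, here's a simplified sketch of the time-stepped energy balance the model runs. It leaves out details the full version handles (round-trip efficiency, partial-power operation) and normalizes everything to a 1 kW load:

```python
def simulate_utilization(solar_kw_per_kw_load, battery_kwh_per_kw_load,
                         solar_cf_profile, dt_hours=5 / 60):
    """Fraction of the year a 1 kW load can run, fed by an oversized PV array
    and a battery. `solar_cf_profile` holds the per-step solar capacity factor
    (0..1) for a full year of 5-minute data."""
    soc = 0.0            # battery state of charge, kWh
    steps_served = 0
    for cf in solar_cf_profile:
        generation = solar_kw_per_kw_load * cf * dt_hours   # kWh this step
        demand = 1.0 * dt_hours                             # kWh this step
        if generation + soc >= demand:
            steps_served += 1
            # surplus charges the battery; anything beyond capacity is curtailed
            soc = min(generation + soc - demand, battery_kwh_per_kw_load)
        else:
            # load stays off this step; generation still charges the battery
            soc = min(soc + generation, battery_kwh_per_kw_load)
    return steps_served / len(solar_cf_profile)
```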

First, let’s take a look at the value-optimal utilization of some load as a function of its capex, assuming $200k/MW solar PV and $200k/MWh battery storage.

I expected to see two stable attractors, one for lower utilization and low capex, and one for higher utilization and high capex, but the abruptness of the transition surprised me. For loads below $1000/kW, intermittent operation with some modest solar array oversizing is the winning strategy. Above $1000/kW, we charge and discharge batteries daily to run them.
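The chart comes out of a brute-force sweep over array and battery sizes. The post optimizes for value; as a stand-in, here's a rough sketch that minimizes total capex per kWh the load actually consumes, which captures the same trade-off (cheap loads tolerate idling, expensive loads justify batteries). The sweep ranges are illustrative.

```python
def best_design(load_capex_per_kw, solar_cf_profile,
                solar_capex_per_kw=200, battery_capex_per_kwh=200):
    """Sweep solar overbuild and battery size, minimizing total capex per unit
    of energy the load actually consumes. Uses simulate_utilization() from the
    sketch above. Stand-in objective, not the post's exact value function."""
    best = None
    for solar_kw in [0.5 * i for i in range(1, 41)]:       # 0.5x .. 20x overbuild
        for battery_kwh in range(0, 41):                    # 0 .. 40 kWh per kW of load
            u = simulate_utilization(solar_kw, battery_kwh, solar_cf_profile)
            if u == 0:
                continue
            total_capex = (load_capex_per_kw
                           + solar_kw * solar_capex_per_kw
                           + battery_kwh * battery_capex_per_kwh)
            cost_per_kwh = total_capex / (u * 8760)         # capex $ per kWh of annual consumption
            if best is None or cost_per_kwh < best[0]:
                best = (cost_per_kwh, solar_kw, battery_kwh, u)
    return best   # (cost metric, solar overbuild, battery kWh, utilization)
```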

As solar and batteries continue to decost, incrementally cheaper loads will switch from the intermittent strategy to the battery-stabilized strategy. This is particularly important for technologies that are around $1000/kW. If you’re developing an intermittency-friendly use case for off grid solar and its capex is in this range, you will have to ensure you keep up with solar’s cost improvements or else your competitive advantage (the ability to run intermittently) will disappear as, essentially, batteries get cheaper than your capex reduction.

One example I can think of is intermittency-friendly reverse osmosis desalination gear. RO typically costs ~$2000/kW, placing it firmly in the battery-backed strategy. One could imagine investing billions of dollars to build a cheaper RO plant that was also intermittency-friendly – but it would be a race against solar and battery cost progress, which is up to 20% per year. Alternatively, one could start with $100/kW technology, build a desalination system around it, and then optimize for efficiency while keeping capex low.

Let’s take a closer look at how these systems are actually deployed. In this chart, we show a range of loads of varying capex, together with their utilization, plotted by the size of their solar and battery systems. These points are chosen to maximize value at each capex level.

Starting from the lower left, an extremely cheap load ($10/kW, comparable to an old electric kettle) is best run with a solar array of equivalent size, and an average utilization of just 0.21.

As we move up and to the right, we’re increasing the capex of the load. The next datapoint, at $1000/kW, is far beyond the capex of the Terraformer, which still operates very much in a pure solar mode. But the $1000/kW load achieves a utilization of 0.33 despite the intrinsic solar utilization in Wyoming being just 0.17, by running on an overbuilt PV array, with nameplate peak power 2.4x higher than the load’s demand. This pure solar peak shaving strategy is the minimum cost, maximum value point.

Beyond $1000/kW, additional value growth and higher utilization are achieved through rapid growth of the battery pack, up to a limit of about 15 kWh/kW-load, or enough to run the load at full power for 15 hours. Through this range we’re also linearly increasing the size of the PV array – at $10,000/kW we’re at about 8.3x overbuild by nameplate, though if you multiply 8.3×0.17, we’re generating only about 40% more energy over the course of the entire year than the load would consume running flat out, and that wastage occurs during the summer. We’re also at 0.94 utilization, about 3x the no-battery mode.
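Spelling out that arithmetic, with all numbers taken from the paragraph above:

```python
overbuild = 8.3      # PV nameplate relative to load nameplate
solar_cf = 0.17      # annual solar capacity factor at this site
utilization = 0.94   # fraction of the year the load actually runs

annual_generation = overbuild * solar_cf   # ~1.41x the load's flat-out annual energy
annual_consumption = utilization           # 0.94x flat-out annual energy
surplus = annual_generation - annual_consumption
print(annual_generation, surplus)          # ~1.4 and ~0.47: about 40% more generation than a
                                           # flat-out load would need, surplus (mostly summer) curtailed
```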

Surprisingly (to me at least), beyond this point we don’t see a further increase in utilization through increased battery capacity. A battery that’s discharging every day gets great utilization, but one that’s only there in case the sun doesn’t come up for two days in a row barely gets used. Instead, the optimal approach to increasing utilization is to continue to scale up the solar array size, eventually hitting 17x in terms of peak power, or about 3.4x in terms of average power – enough to take care of consecutive gloomy winter days and squeeze out those last few percentage points of utilization.

Look closely at how far apart $1000/kW and $2000/kW are! And yet, as solar and batteries get cheaper, the $1000/kW data point will jump up the cusp into the higher utilization zone in just a few more years.

For reference, an AI training data center costs around $50,000/kW, and this simulation shows that ~0.98 utilization is cost-optimal. This is equivalent to ~170 hours per year of zero operation or, more likely, ~1700 hours of operation degraded to 90% of full power. Of course it is possible to achieve utilization closer to 99.9%, but you’ll need a bigger solar array.
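Checking those figures: 2% of a year, taken either as outright outage or spread out as a 10% derating.

```python
hours_per_year = 8760
utilization = 0.98

outage_hours = (1 - utilization) * hours_per_year   # ~175 h of zero output per year
degraded_hours = outage_hours / 0.10                # same energy shortfall as ~1750 h at 90% power
print(round(outage_hours), round(degraded_hours))
```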

Finally, let’s take a look at power system cost as a function of desired utilization.

Reflecting what we found in the AI datacenter post, there’s a smooth cost increase for increasing utilization. Below about 0.3 utilization, no batteries, pure solar. Between 0.3 and 0.9, solar cost increases linearly while battery cost rises more steeply – they are equal at around 0.5 utilization. Above 0.9, it’s all solar PV overbuild, because it’s cheaper to squeeze more electricity out of a short cold cloudy day than it is to carry around a spare day’s worth of batteries you might use once per year.
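That curve is just the flip side of the earlier sweep: fix a utilization target and find the cheapest system that meets it. A sketch, reusing the same simulator and the same $200/kW and $200/kWh assumptions:

```python
def min_cost_for_utilization(target_u, solar_cf_profile,
                             solar_capex_per_kw=200, battery_capex_per_kwh=200):
    """Cheapest solar + battery combination (per kW of load) achieving at least
    the target utilization. Uses simulate_utilization() from the sketch above."""
    best = None
    for solar_kw in [0.5 * i for i in range(1, 41)]:       # 0.5x .. 20x overbuild
        for battery_kwh in range(0, 41):                    # 0 .. 40 kWh per kW of load
            if simulate_utilization(solar_kw, battery_kwh, solar_cf_profile) >= target_u:
                cost = solar_kw * solar_capex_per_kw + battery_kwh * battery_capex_per_kwh
                if best is None or cost < best[0]:
                    best = (cost, solar_kw, battery_kwh)
    return best   # ($ per kW of load, solar overbuild, battery kWh)
```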

In summary, while solar power is able to pair with batteries to provide essentially 100% utilization, or to load-shift for grid applications, there’s a previously unexploited opportunity for ultra-low-cost power at lower utilization – which is where the Terraformer lives.

Edit Dec 2024:

Recently my friend Duncan Campbell over at Scale Microgrids published a great report looking at off grid solar+battery power production for AI hyperscaling data centers in the sunnier parts of the US. (I wrote a post on this in March 2024.) I got a sneak preview and was able to point out that their solar cost numbers, iirc around $0.85/W, were a lot higher than the current minimum cost (9c panels, 8c mounting including labor), and that if they adopted more aggressive solar cost numbers, they could delete their dependence on natural gas for the last 10% of the power backup. Well, we can see which way the wind is blowing!

So I reran the code used for the post I wrote last month (above) for this specific site and use case, and I’ve made it available for all of you!

Raw data and code are available here. I’m not sure if this sort of math is really hard or if I’m overthinking it, but making sense of this problem took a fair amount of brain sweat.

Here’s the data from a site in Texas.

This chart shows how subsystem costs vary as a function of load capex for optimum utilization. For example, there’s no point using batteries unless your load capex is over $1000/kW. As utilization increases towards the right, the gap between capex and utilized capex narrows. Power system capex climbs sharply around $1000/kW but is otherwise pretty flat.

This chart is the same as the one above, but for a different location. Similar properties, though. It shows directly how optimal utilization varies as a function of load capex.

This chart shows effective power cost per MWh as a function of load utilization. I haven’t figured out how to combine this nicely with the chart above, but it’s easy enough to read off the optimal utilization and then check the power cost below. Importantly, solar + battery power amortized over 10 years ranges from about $12/MWh to $50/MWh over most of the range – far lower than any other electricity source. Even for 99.9% uptime, the power cost increases only modestly, through some extra solar overbuild.
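The $/MWh figures come from amortizing the power system capex over its life. A simplified version of that calculation (capex only, no discounting, O&M, or replacements) looks like this:

```python
def power_cost_per_mwh(solar_kw, battery_kwh, utilization,
                       solar_capex_per_kw=200, battery_capex_per_kwh=200,
                       years=10):
    """Capex-only cost of delivered energy per kW of load, amortized over
    `years`. A simplification of the full calculation."""
    capex = solar_kw * solar_capex_per_kw + battery_kwh * battery_capex_per_kwh
    mwh_delivered = utilization * 8.76 * years   # MWh per kW of load over the period
    return capex / mwh_delivered

# e.g. the cheap end: 1x solar, no battery, 0.21 utilization
print(power_cost_per_mwh(1.0, 0.0, 0.21))   # ~$11/MWh, near the low end of the curve
```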

For completeness, here’s a year’s worth of power production at the north Texas site that was studied.