I recently read and reviewed Nick Harkaway’s Titanium Noir, a noir detective novel set in a world ruled by Titans: humans made immortal and superhuman by a drug called T7. It is a static world, its technological progress frozen, its society controlled by a tiny elite – the Titans.
The Titans have every incentive to keep it that way. If you intend to live forever, you want predictability. You suppress black swan events. You prevent anyone else from accessing the technology that made you powerful.
Like all good science fiction, Titanium Noir made me think about the current moment – about ASI, and the impact a powerful new technology might have on society.
The Accelerationist Promise
The dominant narrative around ASI assumes dynamism.
Ray Kurzweil’s singularity. Dario Amodei’s “Machines of Loving Grace,” which imagines AI compressing a century of scientific progress into a decade. The promise is exponential takeoff: once we build superintelligent systems, growth compounds, scarcity dissolves, and abundance follows.
The doomers share this assumption of exponential takeoff, just with the sign flipped. Eliezer Yudkowsky’s scenarios (as laid out in “If Anyone Builds It, Everyone Dies”) and reports like AI 2027 project rapid, destabilizing change. Whether utopia or catastrophe, the shared premise is acceleration.
But is this a foregone conclusion? What if the incentives point elsewhere?
Infrastructure Investment
Trillions of dollars are being invested right now in AI infrastructure: data centers, chips, power plants. Microsoft is signing 20-year power purchase agreements. NVIDIA’s market cap rivals the GDP of mid-sized nations. The US has imposed export controls on advanced chips to China. This is concrete capital deployed by a small number of companies with the resources to play at this scale.
AI is constrained by compute, which is constrained by power, which is constrained by massive capital investment and regulatory approval. The entities building this infrastructure are building moats. And a sufficiently powerful AI system, controlled by a sufficiently small group, creates interesting incentives.
Does it make sense to keep investing trillions of dollars in compute? At what point is the investment enough, and do the returns still justify it?
The Stasis Thesis
Consider an alternative scenario. ASI emerges, powerful but without agency. Think of it as a super-powered, general-purpose Claude Code – but without consciousness or autonomous goal-seeking behavior.
I think this is as plausible as the scenarios involving goal-oriented or “selfish” behavior that keep AI safety researchers up at night.
The ASI systems in this scenario are transformative, but also controllable, and controlled by those who built and own the infrastructure.
What do they do with it?
Titanium Noir suggests an alternative: freeze the world.
A small elite controls compute and power. A large population lives in stasis, perhaps supported by something like a Universal Basic Income, pacified and surveilled by these AI systems.
The technology that could enable abundance instead enables control. Growth stops because those in power benefit from predictability. Black swan events are suppressed. The world becomes static.
This is dystopia in the mundane sense. A world where nothing much changes, ever, because change threatens the position of those who own the infrastructure.
AI and Capital
In late December 2025, Philip Trammell and Dwarkesh Patel published “Capital in the 22nd Century,” arguing that while Piketty was wrong about the past, he may be right about the future.
Their thesis: once AI and robots can fully substitute for human labor, the economic logic that has historically raised wages breaks down. Capital accumulates indefinitely in the hands of those who own it. Wealth concentrates. The gains flow upward without limit.
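To make that mechanism concrete, here is a toy sketch – my own illustration, not a model from their essay. It assumes robots are perfect substitutes for human labor, and every parameter value is made up for illustration: as the robot stock grows, the competitive wage falls with the marginal product of effective labor, while the income of robot owners compounds.

```python
# Toy model of wages when robots fully substitute for human labor.
# My illustration, not Trammell and Patel's model; all numbers are assumptions.

ALPHA = 0.6          # output elasticity of effective labor (assumed)
LABOR = 1.0          # fixed supply of human labor
SAVINGS_RATE = 0.3   # share of capital income reinvested in robots (assumed)

robots = 0.5  # initial robot stock, in human-labor equivalents
for year in range(51):
    effective_labor = LABOR + robots          # humans and robots are interchangeable
    output = effective_labor ** ALPHA         # other inputs normalized to 1
    wage = ALPHA * output / effective_labor   # competitive wage = marginal product
    capital_income = wage * robots            # robot owners earn the same marginal product
    labor_share = wage * LABOR / output       # fraction of output paid to humans
    robots += SAVINGS_RATE * capital_income   # reinvest capital income into more robots
    if year % 10 == 0:
        print(f"year {year:2d}: output {output:.2f}, wage {wage:.3f}, "
              f"labor share {labor_share:.1%}")
```

Run it and the pattern is the one they describe: total output keeps growing, but the wage and labor’s share of output fall toward zero while capital income compounds.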
This is pessimistic, but it still assumes dynamism. Growth continues; it just accrues to the owners of capital.
The stasis thesis is more pessimistic. What if those who control ASI don’t want continued growth at all? Does generating shareholder returns even matter once economic growth is no longer the goal?
ASI could be a technology capable of suppressing and controlling everything. Those who control the infrastructure would have a tool to assert complete dominance. And if maintaining control means ASI-induced stasis, then to them it might be a price worth paying.
The Titans of Titanium Noir froze their world because immortality makes you conservative. Infinite time horizons make you risk-averse. You stop wanting change and start wanting control. Growth itself becomes a threat.
Trammell and Patel worry about inequality spiraling upward forever. I wonder if the ceiling is lower and harder: a world frozen in place by those who got there first.