Everybody is now an (underpaid) CTO


Software Engineering · Agent Babysitting · Leadership · Optimistic Doomposting · Why Did I Check My Portfolio

Remember when everyone thought AI would make software development easier? It did. Whether you like it or not, it has completely transformed the industry.

I am not an AI denier - this is not an AI hate post. We've had enough of those (I think I've read 10 in the past two days). It's time to think about the repercussions: the cascading effects of AI on work - responsibilities, expectations and compensation.

A lot more is expected of software engineers now than one, two or five years ago. Often these expectations come from people with non-technical backgrounds who are enchanted by AI lab promises of productivity gains and business transformation. No issue with that; sell what you have to sell.

You'll see it when you scroll through what's left of social media: the latest trends are being promoted and praised by project managers, recruiters, business analysts, executives, GTM. AI has given them a window into our wonderfully creative, technical and rewarding world (good for them!). But these expectations don't translate, and the result is that engineers are expected to deliver 5x more, 5x faster - while managing agents: 3-4 hyperproductive, non-deterministic, adderall-addicted senior engineers that can't make their own decisions, have zero autonomy and zeroooooo creativity.

You've been promoted to CTO, yet your salary didn't change. Disagree? Don't worry you can vote on it later. For now it's my job to convince you.

The Constantly, Rapidly, Unpredictably Changing Tooling Landscape

First, let's look at the recent agentic offerings to gauge the direction enterprise, white-collar work is heading in:

  • OpenAI released Codex (CLI and app) and Frontier. Exclusively for non-production agent management, so literally managing subordinates.
  • Anthropic released Claude Cowork and Claude Code (CLI and desktop).
  • Google released Antigravity complete with agent SWARM management, Gemini CLI, and probably 10 other identical projects spearheaded by a TPM trying to get promoted before abandoning the project entirely.
  • Microsoft? I'll leave it at Copilot and you can try to figure out which copilot that is.

I actually don't have an issue with this increased workload, responsibilities, delegates, deliverables, information, code, documentation, tests, deployments, infra, MLOps, tools, pipelines etc. "It's part of a changing landscape."

It's life; you're a cog in the machine, and the machine just got lubed to the gills. I'm not going to stubbornly hand-weave through an industrial revolution where OpenWeave just released the Silk Weaver 3000.

The issue is that compensation has remained stagnant.

Efficient Markets Hypothesis

In a brief discussion on HN with simonw about this, he argued that these productivity skills become leverage when you apply elsewhere: the market will correct once skilled devs apply elsewhere, get offers and negotiate raises.

I haven't seen any examples of that. Job postings have gone from "Comfortable using AI-assisted programming - Cursor, Windsurf" to "Proficient in agentic development", yet the salary range has remained exactly the same. Companies are essentially expecting junior/mid-level devs to have management skills.

HN Discussion

This doesn't just apply to SWEs; these changes affect every enterprise worker (although SWEs are disproportionately affected, and they're what I'm qualified to speak about). In the past, a SWE would have a sprint in which they planned features, bug fixes, refactors, documentation upgrades and so on with their team. Throughout the week they'd tackle their assigned Jira tickets, submit PRs, review PRs, write tests, and have some time to learn, even if they were worked to the bone.

What does that look like now?

You wake up. You probably don't even send a good morning message to your team. Instead it's "Good morning Claude" (I'm guilty of this) and you immediately pick up exactly where you left off, managing 3-4 devs on crack (any more and you're a psycho, and you're not steering your agents well).

Barely six months ago it was pair programming: you'd work with an LLM in Cursor, manually verifying every change, copying snippets, asking questions. It still is for a lot of people, and it still can be.

But it's obvious from the tooling push that the industry is not heading in that direction.

Corporate Kramer: "Could you keep it down to a low roar? Some of us have to prompt Claude in the morning."

Analysing Language

Looking at the recent launches:

OpenAI Frontier + Codex:

"Frontier, a new platform that helps enterprises build, deploy, and manage AI agents that can do real work."

"75% of enterprise workers say AI helped them do tasks they couldn't do before."

"At OpenAI alone, something new ships roughly every three days, and that pace is getting faster."

Hyperacceleration, more work - faster.

"AI coworkers need the same things [as people]: onboarding processes, institutional knowledge, learning through experience, improve performance through feedback, access to the right systems and set boundaries."

"A command center for agents."

Personnel management.

"the gap between early leaders and everyone else is growing fast."

It will happen to you and your company. They're pushing for it.

Google Antigravity:

"a mission control for spawning, orchestrating, and observing multiple agents across multiple workspaces in parallel."

"We are transitioning to an era, with models like Gemini 3, when agents can operate across all of these surfaces simultaneously and autonomously."

Anthropic Claude Cowork + Desktop:

"Engineers are shifting from writing code to coordinating agents that write code, focusing their own expertise on architecture, system design, and strategic decisions."

Mission control, command center, orchestrating, multiple agents in parallel. Pure org management.

Breaking news! As I write this Anthropic released Claude Opus 4.6. And with it a new concept: Agent Teams.

"One session acts as the team lead, coordinating work, assigning tasks, and synthesizing results." "Monitor and steer: Check in on teammates' progress, redirect approaches that aren't working, and synthesize findings as they come in."

Apparently we're not managing independent agents anymore. As I'm writing about how AI is forcing mid-level devs into tech lead responsibilities, I find out I'm being promoted, LFGGGGGG! Now I'm managing teams of agents: agents that manage themselves and each other while you oversee the entire operation. CTO, baby. Six months ago it was pair programming. L4 to CTO in six months.

And just now GPT-5.3-Codex drops dkm:

"As model capabilities become more powerful, the gap shifts from what agents are capable of doing to how easily humans can interact with, direct and supervise many of them working in parallel."

What can I say that I haven't already said?

Let's not even talk about how much this has crept into your daily life. With Cursor you can manage agents from your mobile.

Peloton Coding: Or from your Peloton???

Promoted Without A Raise

The main point is: Are developers and enterprise workers seeing compensation increases proportionate to these flaunted corporate productivity gains? More importantly, are they seeing salary increases proportionate to the sheer amount of new and increased responsibilities?

When you jump from L4 (mid-level) to L5 (senior), you get a salary bump proportionate to your increased responsibilities.

At Meta, that's $322K → $486K (51% increase, $164K bump). At Google, it's $297K → $400K (35% increase, $103K bump).
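The arithmetic behind those bumps is easy to sanity-check (the figures themselves are the public levels.fyi-style numbers quoted above, not something I can verify here):

```python
# Sanity-check of the quoted L4 -> L5 comp bumps.
# Figures are the ones cited in the post, not independently verified.
meta_l4, meta_l5 = 322_000, 486_000
google_l4, google_l5 = 297_000, 400_000

def bump(lo: int, hi: int) -> tuple[int, int]:
    """Return (absolute increase, percent increase rounded to whole %)."""
    return hi - lo, round((hi - lo) / lo * 100)

print(bump(meta_l4, meta_l5))      # (164000, 51)
print(bump(google_l4, google_l5))  # (103000, 35)
```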

What do you get for that extra six figures? Your scope expands from team-level execution to multi-team coordination. You start mentoring other engineers. You own entire systems instead of components. You create scope by finding and solving ambiguous problems. You translate business needs into technical strategy, design systems, make architecture decisions, coordinate dependencies, DELEGATE work across your team.

Sound familiar?

In our new wonderful agentic paradigm, mid-level developers are assuming these responsibilities - the responsibilities of a senior/staff engineer - while remaining at mid-level salaries. A gap of up to $400K a year.

It's even worse in the UK, about 10x worse but I'll save it for another post.

Broken Leveling System

The entire SWE hierarchy was built on stable assumptions: L4 does this, L5 does that, L6 coordinates at this scale. Those definitions took decades to stabilise across the industry.

AI agents broke them in six months. Cool.

When an L4 is doing L5 work plus L6 coordination, what does "L4" even mean anymore? When job postings demand "proficient in agentic development" but the comp bands haven't moved, what exactly are they paying for?

What does career progression look like when a mid-level already has staff-level responsibilities? What's the incentive for your manager to promote you if you're already doing the work for free? What does a junior even look like in this market?

Companies are moving fast on the tooling—agent teams, multi-agent coordination, autonomous workflows. But what's being done internally to update leveling, compensation bands, and job descriptions to match this new reality?

The longer this mismatch persists, the worse it gets. AI labs are shipping faster than corporate structures can adapt. At some point, the gap between what the job is and what the level says it is becomes too obvious to ignore.

Funnily enough, the AI labs have completely understood and adapted to this: scroll through their job listings and every single role is either "Member of Technical Staff" or "Forward Deploy/GTM". They've gone the way of ambiguity. It makes sense when internally roles are constantly changing:

"With many researchers and engineers at OpenAI describing their job today as being fundamentally different from what it was just two months ago."

We need to reshuffle the hierarchy now. Define what each level means in an agentic world. Update the comp bands to match. Give people clarity on what they're actually being paid to do.

The Worst Case in a Tough Market (Vent Sesh)

This could turn out to be the worst example of class drift/capitalist disparity/worker shafting in history.

The market sure seems to be reflecting this. MAG7 is at all time highs, the monopolies continue to monopolise. The disruptors get folded into existing monopolies through investments, acquisitions, deals etc.

"But you're shareholders! You benefit from the productivity gains through equity!"

Bar the MAG7 everyone else is getting utterly hammered, especially in recent days. Most tech/SaaS companies are down 40% in four months because the market believes agentic coding will consume their entire moat.

So realistically you're taking on more responsibility, racing against AI labs praying they don't end your existence with a decision to clart your entire market, and your equity is in freefall. LOVE TO SEE IT!

The SWE landscape is changing so drastically that it's very difficult to keep up. Now a tool/model comes out that you must learn and integrate, otherwise you fall behind. Back in the day it was "Oh, a new React framework/Python library, I'll learn that over the weekend. Even if I don't, it doesn't threaten my livelihood and the existence of my progeny."

Things are changing so fast you don't have time to stop and think about the broader picture. Especially with layoffs, hiring freezes and weakened job security. It feels as though the engineer is not in a position to negotiate. At all.

The Silver Lining (Happy Fun Good Case)

Pause for a second and think about what your job has actually become. AI has transformed everyone into mini-CTOs. It's easy to get disillusioned and gloomy. But these tools have been made available to everyone. We're in a period of hyperacceleration and innovation, and you have more autonomy to create whatever you want than ever before. Chin up, lad. You've likely learnt more this year than ever before, and you continue to learn more every day. Who knows, maybe this increased responsibility results in increased confidence, increased entrepreneurship and eventually the greatest innovations.

But for those who want to go to work and be paid a fair wage for their efforts, skills and responsibilities: it's important to monitor how this landscape is changing, and to determine your standing within it.

"The world will ask you who you are, and if you don't know, the world will tell you."

- C.G. Jung

Sopranos quote: Tony Soprano... hahahahaha, duality of man.


What do you think?

I'd love to hear your perspective. Genuinely. Take a minute to share your thoughts:

  • A lot more is expected from software engineers now than 1/2/5 years ago.
  • These expectations often come from those with non-technical backgrounds (PMs, business, HR, management).
  • Non-technical people are more enchanted by AI lab promises of productivity gains and business radicalisation.



If you're wondering why my tone is a bit different and doomcore, blame the stock market. :)

Honestly, this was a hard post to write, and especially hard to NOT have it end on a bad note. I'm by no means anti-AI. I'm turbo pro-AI. But someone needs to think about the consequences, eh?

Feel free to email me if you loved/hated/related or for anything really, I reply to all -> [email protected].

Cover: "Workmen at Carrara" by John Singer Sargent, 1911