Legally ban certain autonomous LLM-based AI agents, or risk societal collapse?


I don’t even have time to blog; I have a million other things I need to be doing. I apologize in advance if this isn’t well written. I’m not a writer, nor do I want to be one (I also intentionally didn’t use AI for this, almost as an act of defiance).

I work as a Director of Engineering for a startup that’s doing pretty well, and I have also consulted with private equity companies in the tech space. I’m writing this because I have kids and I’m a bit nervous about what’s coming. I’m also writing this because I’m hoping someone will respond with a rebuttal that makes me relax and realize everything will be fine. But I’ve come to a realization that I can’t dispute:

We have to institute some form of a legal ban on the commercial usage of autonomous, externally-mutative, LLM-based AI agents or our society as we know it will collapse.

Trust me, I know how crazy this sounds. I have just been racking my brain and can’t figure out a way around it. And we can discuss the exact terms of what this means later (largely I think I mean banning AI from modifying the outside world on a fully automated basis using LLMs, but people smarter than me would need to think through the logistics of this a bit more and that’s a much deeper discussion).

I’m also trying to be fair about the fact that in human history, we usually haven’t restricted technological advancements. I do in fact think the emergence of LLMs themselves is a benefit to society, and they should continue to be used to enhance humans in their jobs, not to fully replace them. Using them to act fully in the place of humans at the scale that we’re approaching has the power to destabilize society as we know it. And despite the fact that I’m aware how amazing a life driven almost entirely by AI might be, I’ll take a bit of inconvenience if it means a functioning and stable economy.

I didn’t panic at all during Covid, mainly because I usually think 99% of the time things will just work out and people will just adapt to change and continue with life. A small portion of people will be affected, but by and large most people will continue with life as usual, albeit a bit different. This was true of the dot-com crash, 9/11, the 2008 housing crisis, Covid, and so on.

At risk of sounding cliché, I’m sorry but this time is totally different.

The fundamental fabric of society is about to change. For you, for me, for everyone. And drastically. Why? We are approaching a time where companies will no longer need a significant number of humans to operate on a large scale.

No one is safe from this, you will be impacted either directly or indirectly. Obviously if a significant portion of the workforce is no longer needed, they don’t have the income to spend at companies that are relatively safe from AI disruption. I’m pretty sure profits in many human labor-required industries will plummet without a healthy middle class to buy their products and services. This ripple effect is one of the things that scares me the most.

If your work is not extremely hands-on or strictly requires in-person labor, you will soon be expendable. And even hands-on labor is susceptible eventually as physical automation improves. But a significant amount of labor in today’s workforce can be conducted almost entirely remotely.

The main rebuttals I’ve seen are “well there’s nuance to my job that it doesn’t understand” or “it doesn’t know how to contextually architect, review, or approve things in the way that I do”. But it will. Obviously all jobs won’t be eliminated, and yes many new jobs will appear, but I think enough human labor will be eliminated to fundamentally break society in a way we haven’t seen.

Some people aren’t fully understanding: with the emergence of MCP (Model Context Protocol) servers, you can build an agent to do almost anything. And the mind-blowing part is, we’re still in the beginning. I can’t emphasize this enough. Someone will say, “but it can’t do <insert random digital work task>.” But it will. So please, before you make that claim, just consider: what if it could? I’m sorry, but for the vast majority of the population, if your job doesn’t require an in-person presence, it just isn’t as complicated or nuanced as you think it is.

So assuming that new state of the world, in the future, do we all just become agent trainers writing markdown skill files? What happens when those files are relatively stable and don’t really need more edits? Then what? Can these markdown-writing positions sustain a whole population?

That’s why this is different to me. Many of us have heard the story of how, years ago, elevators within large buildings used to have an actual human operator, who was eventually automated out of a job by technological advancements. We all accepted that with the common response of “well, he should just treat this as a time to upgrade his skills and move to newly created employment positions within the workforce.” The problem is that in this new society we’re approaching, many companies will simply not need humans for an unprecedented percentage of their business operations.

Someone will respond, “oh, your job will just change to <new job>,” but why would the company not eventually have an agent do that job as well? As the company optimizes for shareholder value and realizes that same job can be automated for cents on the dollar, you are now expendable. Someone, somewhere is probably building an agent to do your exact computer-based job literally right now (including output verification checks and other preventative measures against hallucinations).

Maybe the answer is that all the displaced technically-skilled workers become plumbers, electricians, surgeons, and so on. Is there enough hands-on work to sustain all of society, and for how long? I don’t know.

And do what, exactly? UBI is a compelling option, but my primary concern is the potential human identity crisis. Work is a form of social control; it just is. I am nervous about a world where human labor is no longer required for a huge portion of the population, simply because of the latent human desire for identity and purpose. For as many people as I know who would absolutely prosper in a world where trading time for income was no longer a requirement, I know a fair number who would likely lose themselves without a daily required directive. I can concede that maybe corporate labor shouldn’t be the primary means of deriving purpose, but right now it definitely is.

An alternative is for the government to provide a surplus of state and federally funded positions for displaced workers. But historically, market-driven companies have just been better providers for labor employment positions, strictly due to the premise of:

  1. I have an idea

  2. I employ humans (either directly or indirectly) to help me accomplish said idea

  3. I profit

This strategy has lasted since the beginning of time, mainly because it isn’t as susceptible to the whims of political fervor. The idea that the availability of these newly created government positions may depend on the (ir)rationality of whichever administration is currently in power — and not on actual economic markets — is not a great answer, as we’ve seen in recent years.

And let’s not forget, to accomplish either of these options you would need to heavily tax corporations that use these autonomous agents, or the platforms that provide the agents themselves. And given their potential lobbying power in this new era, good luck with that.

Get ahead of the curve. Recognize that companies are not charities. If they have the ability to commercially deploy autonomous, externally-mutative LLM-based agents on a large enough scale, they will replace human labor wherever economically feasible, and it will create a snowball effect that will destabilize our entire economy for years. You can either accept that you are about to enter a world where you are expendable due to your career’s worth of knowledge being easily adapted into an agent, or work to enact some form of legislation to prevent this from happening. Or we all go to electrician school or whatever while we wait for our government to institute an unprecedented tax reform.

Hopefully this will at least encourage a discussion somewhere about how to avoid this impending future. Or please, convince me that I’m wrong, and like the other tumultuous times in history, everything will be just fine.

Thanks for reading. If you think I’m exaggerating, here are a few resources that may change your mind:

https://fortune.com/2026/02/11/something-big-is-happening-ai-february-2020-moment-matt-shumer/
