AI is too general a term to be useful, but it’s the term I have to use because that’s what we’ve decided to use. When I talk about AI, I mean exactly what your gut — today’s gut, not the gut of some guy 50 years in the past or future — thinks when someone says AI. I’m a product of my time so I must use the products of my time.
I think AI is very cool and could be a powerful tool. I’d have to be blind and tasteless to think otherwise. But I remember the exponential jump from indecipherable spaghetti to clarity. We might still be in the middle of that jump. And I don’t know when, perhaps it was a year or a month or a day or just a moment ago, but I think there’s something wrong with it.
This isn’t a unique sentiment, but because I’m me and not everyone else it’s unique to me. What’s wrong with AI? The image generation is cool, the videos it can produce now are impressive, the data consolidation and analysis blows me away. What’s the issue? Any qualms I may have with AI now will surely be eased as it gets better and better. Right?
It’s a gut feeling, not a rational feeling, and I don’t like that because gut feelings don’t use modus ponens. Still, I can’t ignore what it tells me. It says something is wrong, so I will do my due diligence and bring forth the reasoning from the mire of instinct.
If you can’t pin down what something is, sometimes it’s best to pin down what it isn’t.
I really don’t care how much water or GPUs or CO2 or child slaves the advancement of AI uses. I think that stuff is bad, sure, but I had this feeling before I knew the extent of these issues. And now that I’m aware of them (barely, if I’m being honest), the feeling still stays. If you are thirsty and you drink something that does not quench your thirst, you did not drink the thing you were thirsty for. You must keep searching for water.
Some argue that AI can’t produce art. They work with the definition of art to say it requires a sender and a receiver with a message, and AI cannot fulfill these requirements because it does not transmit a message.
There’s a more eloquent way to put the argument, but again I don’t care about it. Even if I believed that argument and no longer thought AI could produce art, the feeling would remain. Even if I rejected the argument instead, believing that AI could produce art and that artists would one day be replaced by a superior art AI, the feeling would remain.
It’s not about art, even if that is a massive issue.
Lots of people use AI. Memes made by AI float around online. Some are really funny. I’ve seen plenty of silly cat AI videos, animals shooting guns on doorbell cams, monkeys being sucked up into tornadoes, dogs using Hydro Pump on unsuspecting old ladies.
Lots of people use it for research, to produce cover letters, to do their homework, to study, to run quick information searches. I’ve got friends who use it to streamline their workflows, automating tasks that would take 40 hours by hand so they finish in seconds. It’s everywhere.
And the fact that it’s everywhere isn’t the problem. I see how it affects people. I had a very heated discussion on whether AI could cause psychosis (which also isn’t the problem). Regardless, it is so pervasive it is impossible to avoid. Any single issue one would have with AI would span all of society.
Even so, this is not the problem.
What is AI? Is it a tool? If a tool, for what? A tool exists to fulfill a purpose. A wrench wrenches, a hammer hammers, a car drives, a pen writes, and AI… what? What is AI for?
In the past, AI was specialized, made for a single job. Really, these were sophisticated algorithms carefully crafted to achieve a specific end. Chess engines found the best chess moves. Search engines pulled up the most relevant web pages. Graphing calculators displayed values on a Cartesian plane. Specific. Specialized.
AI is being made not for one specific thing, but everything. If it’s weak at one thing, researchers work to make it better at that thing. It used to be terrible at everything. Now it’s good at many things. Soon, it’ll be built up to do everything perfectly. At least, that seems to be the goal.
So it’s a modern tower of Babel. Or an attempt to fix the consequences of the original tower of Babel. So what? Is that such a bad thing? If AI does replace us in everything from the arts to sciences, does it matter? We made the solver, we are the maker, and the maker is greater than the made. What’s the difference between us solving every problem and AI? Would we be as doomer as we are now if we were the ones who figured it all out, who produced perfect art, who wrote every beautiful sonnet and spectacular fantasy imaginable? Who cares if it’s us or our creation? The honor is ours either way.
This is in the right direction and a good topic for later discussion, but not the issue I seek.
Let’s make it more interesting: we achieve AGI. Not Artificial General Intelligence, but Artificial God Intelligence — something so much more powerful than whatever we thought we could produce. We have distilled God into ones and zeroes or qubits or whatever, put him in a box, and he does our bidding. He obeys with perfect obedience, and he always does what we ask. Limit him to be a chatbot/video generation model, whatever. He’s in a box and he’s ours. Every problem I’ve listed is accounted for. We’re going for steel, not straw, here.
I have this thing in my pocket. I ask him to make a video, and he makes exactly what I was going for — exactly. I don’t need to tweak anything, and it’s beautiful.
I have a problem. I ask him for advice. He gives me options, I choose one, and it’s exactly like he says. The outcome as described occurs, the advice was correct, the problem is solved.
Maybe this isn’t such a problem. Maybe having everything be perfectly solved forever is a good thing and I’m just a crabby curmudgeon. But compare this created god to the real God. Sometimes we forget, but you can make requests to him like a chatbot: “God, what should I do?” “God, help me with this.” “God can you guide me through this?” These are prompts, but the response doesn’t come as a wall of text or a generative video. We have a God intelligence already, his name is God.
If God, in his infinite wisdom, ignores things or guides us in ways we cannot comprehend, then what are we building? AI is not heading towards an Artificial God Intelligence, because it gives us exactly what we want and never anything else. If it’s not God, then what is it?
What is AI and where is it going?
I look at AI with a scowl now, mostly because it’s everywhere and it’s ugly. We play with it like a toy, but of all the things it can be it can’t be that. Some of it looks cool, but once I learn it’s AI there’s a dirt aftertaste at the back of my throat.
I’m frustrated that I can’t spell out the exact reasoned argument for why that is, why I want to avoid it at all costs and want others to stop using it so much. Maybe it’s enough to say we don’t know what we’re building, and maybe we shouldn’t build something so powerful if it doesn’t have the wisdom or discernment of God.
Maybe there’s a reason why there’s only one true God.
