Stop Talking to Technology Executives Like They Have Anything to Say


I do not enjoy writing about technology. Aside from the shiny, birdlike emotion I get from opening a new Apple product every three years, I do not enjoy using much of modern technology. I view my phone with mistrust and resentment, even as I allow it to eat my time with distractions to recover from a day of ostensibly engineering software in a career I also view with mistrust and resentment.

Unfortunately, sometimes somebody says something so ontologically fucking stupid the usual defenders of intellectual dignity are overwhelmed by the blitzkrieg of obvious affronts to the human condition and can miss the depth of the rot. This time it’s Sam Altman’s response to the question “How do we figure out what’s real and what’s not real?”:

I mean, I can give all sorts of literal answers to that question. We could be cryptographically signing stuff and we could decide who we trust their signature if they actually filmed something or not. But but my sense is what’s going to happen is it’s just going to, like, gradually converge. You know, even like a photo you take out of your iPhone today, it’s like mostly real, but it’s a little not. There’s, like, some AI thing running there in a way you don’t understand and making it look like a little bit better and sometimes you see these weird things … But there’s like a lot of processing power between the photons captured by that camera sensor and the image you eventually see. And you’ve decided it’s real enough or most people decided it’s real enough. But we’ve accepted some gradual move from when it was like photons hitting the film in a camera. And you know, if you go look at some video on TikTok, there’s probably all sorts of video editing tools being used to make it better than [reality]. Or it’s just like, you know, whole scenes are completely generated or some of the whole videos are generated like those bunnies on that trampoline. And I think that the the sort of like the threshold for how real does it have to be to consider to be real will just keep moving. So it’s sort of a education question.

Well it would have been an education question if he hadn’t paid someone to do his Intro to Philosophy homework. I’m not sure what he thinks he’s referring to when he says “literal answers” but cryptography in no possible way addresses the question. In fact, any possible detente between an answer and whatever a signature might achieve is immediately shut down by “and we could decide who we trust” because deciding who and what we trust is the entire problem. A cryptographic signature does nothing but spend enormous amounts of time and resources to move the problem behind another curtain where it will be even more difficult to unravel. So I guess of course that’s where a tech CEO’s mind went first.
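For the curious, here is roughly what his "literal answer" amounts to in practice. This is a minimal sketch, not a real provenance system: it signs some bytes with Ed25519 via the Python `cryptography` package, and the "camera" and "platform" roles are made up for illustration. Notice what the verification actually proves.

```python
# A minimal sketch of content signing, assuming a hypothetical camera/platform
# workflow. Uses the `cryptography` package's Ed25519 implementation.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The "camera" holds a private key and signs whatever bytes it emits.
camera_key = Ed25519PrivateKey.generate()
footage = b"raw sensor data, or a rendering of bunnies on a trampoline"
signature = camera_key.sign(footage)

# The "platform" verifies with the matching public key.
public_key = camera_key.public_key()
try:
    public_key.verify(signature, footage)
    print("Signature valid: this key signed these exact bytes.")
except InvalidSignature:
    print("Signature invalid: bytes altered or signed by a different key.")

# What this does NOT tell you: whether the key belongs to a device you trust,
# whether that device faithfully recorded photons, or whether the content is
# "real" in any sense a person cares about. Deciding whom to trust is the part
# the signature quietly hands back to you.
```

The math confirms that a particular key signed a particular pile of bytes. Whether those bytes depict anything real, and whether you should trust whoever holds the key, is exactly the question we started with.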

Perhaps recognizing that he has wasted his first couple of sentences, he activates his subdermal THC dispenser and proceeds to vibe his way through some vague notions about reality and technology and imagery, in the same way everybody got high and talked about this in college: as if nobody had ever thought about any of it before.

The following doesn’t apply to everybody in technology, but it applies to enough of them: At some point STEM education was the only thing the Olds cared about because of something something Asia, and now we have a couple of generations that are highly educated on paper and comically unaware of the complexity of the world outside of WordPress plugins.

The debate about the relationship between modern technological reproduction of images and reality has been going on since the invention of the photograph. An entire academic career could be dedicated to categorizing the literature that has already been written about it. In fact, a couple of courses could be spent unpacking what constitutes “better than reality.”

Turns out, figuring out what’s real is not easy and Sam Altman is unqualified to comment on it in a serious way. The question itself is almost always a bad choice even in rhetoric. In an interview, the question gets rolled out to pretend the interview is taking place in a bizarro world where a technology executive might have something interesting to add to the debate. Unsurprisingly, they never do.

It’s not the lack of knowledge alone that makes these conversations so tiresome. It’s not even an unwillingness to admit ignorance: it’s the lack of awareness that there’s already a conversation. Evidence of this erupts constantly from improperly stoppered tech workers’ mouths whenever their work bumps up against social issues, and given the frequency of that bumping one is forced to assume a willful incuriosity. Or, at least, a confidence that nobody else did any reading outside comp sci, so a mumbling attempt at stoner epistemology will sound insightful.

The lead-up to this question was a quick summation of a recent AI video drama: bunnies jumping on a trampoline. Cute as a motherfucker and faker than that. Its inauthenticity slipped past most people’s radar because if you tell the AI to make the video quality a little shitty, it’s harder to tell it’s AI. People’s reaction to finding out it was AI: not great.

Asking why that reaction wasn’t great seems obvious: people felt they were lied to. Clapback to that obvious feeling falls somewhere between “boo hoo you knew better than to trust the internet” and “come on you enjoyed the cute bunnies just be happy loser.”

To the Boo Hoo people: nice try but you do all your banking and get your news online, and I know you do because you typed a whole comment into a social media site. I didn’t trust the internet even before I knew how half of it worked, and I’ll still dish out my social security number if the graphics look professional and it saves me a phone call. A balance of trust and convenience is applied to each situation, exactly like every other single thing in life. To a lot of people, AI is violating the truce of digital representation, and forcing us to become yet even more suspicious of everything we see. This at the same moment the major, clearly-should-have-been-broken-up-monopoly companies are pushing the narrative that if we don’t use AI we’ll get left behind, which is a bald-faced scare tactic to get us to buy into the game so they can paddle upriver long enough to get AI that will let them leave us behind anyway. I don’t think he knows it, but the future Altman sees when he says our sense of reality will “converge” is the one where everybody shrugs and accepts that our access to useful information has yet again fragmented under the weight of the paranoid alienation his ilk keep pumping into the system.

To the Come On Be Cool contingent: yes, bunnies are innately cute to humans for some reason, but one of the more important dregs of joy still allowed us in the modern era is the implicit assumption that when we see a cute or cool thing online, it’s because another human had an experience they wanted to share with us. That is the cornerstone drug of social media that keeps us all hooked despite it being cut with more and more digital PCP every year. That people share things with us purely to get attention erodes that pleasure. People looking for attention for money erodes it further. The bots make it worse. Fake pictures make it worse. Fake videos make it dystopian. Fake videos produced near instantly by AI make it borderline apocalyptic. I don’t think we’ll ever know whether shunting a huge amount of socialization into a digital space was a good or bad idea, because everybody in control of that digital space worked nights for twenty years to ensure that it undercut the foundation of social coherence.

When Altman goes on to compare Marvel movies with fake social media, he illustrates that he truly does not understand the problem he is invested in exacerbating. A Marvel movie cannot lie to us in any important way. It’s the difference between entertainment and documentation: we expect to be misled for the purpose of entertainment, and rightly decry illusion in what is presented to us as documentation. Social media has always muddled this demarcation, to the evident detriment of our faith in any kind of information. If Altman thinks it’s fine to make this worse, he can converge on my balls.

A better question, or at least a question related to these problems, might have been: Now that the individual experience of quantifiable knowledge is predominantly filtered through a shrinking variety of companies and devices, what do we do when there’s no way to reliably map that experience to the shared reality we used to know and love? Altman’s not qualified to answer this either, and nobody in tech wants to address it because it’s another scary and unprecedented thing making a small number of people rich, so we end up with “What’s, like, real, man?”

When Suno CEO Mikey Shulman said, aloud and in front of a camera, “I think the majority of people don’t enjoy the majority of time they spend making music,” everybody knew he was an asshole with no interesting friends. I think it’s important to include Sam Altman in this category of asshole. Its members are oblivious to the concept of a world where people want genuine human connection, and want to otherwise engage with reality in interesting, even difficult ways.

These people suffer from a severe lack of imagination. Raised to pursue success along a solitary economic metric, they ignore all arts and sciences extraneous to that pursuit. They treat the world outside their interests like a children’s game they’re not really into. Their wealth insulates them from friction so effectively there’s no incentive or pressure for them to develop an imagination, or diversify their knowledge to the point where an imagination might emerge on its own. I can’t think of a better argument for a humanities requirement than a billionaire being asked “how do we know what is real?” and responding with “cryptographic signatures.”

I beg of them: Go for a walk. Whittle something. Read a book with a title that doesn’t start with a number.

Or maybe somebody else could consider regulating the insane amount of power allotted to people nobody willingly invites to dinner.

Mouse angry. Mouse cut you.