Computers are an inherently oppressive technology (2022)
devever.net

A few weeks ago I was taking the train, but before boarding I noticed two police officers doing their daily patrol and started chatting with them about a crazy lady at the top of the station's stairs. They said they couldn't do anything because it was not their job to take care of the insane, to which I couldn't really say anything other than "I guess that sounds about right". The train I was going to take started to depart, but one guy managed to stick his arm out to keep one of the doors open. The conductor saw this and refused to re-open the doors even though the guy seemed to be visibly distraught and maybe even in pain. He was trying to help me catch the train, but the conductor had other ideas.
It's not just the machines that are ruthless. The operators also take on attributes of the machines they manage and in the process inflict willful cruelty on others through their control of the machinery. The most obvious example of this is, of course, war and the technology associated with its execution. So this essay is right, machines/algorithms are ruthless but it is the people that use them to inflict pain and suffering on others that makes the whole thing into a grand tragedy.
> The operators also take on attributes of the machines they manage and in the process inflict willful cruelty on others through their control of the machinery. The most obvious example of this is, of course, war and the technology associated with its execution. So this essay is right, machines/algorithms are ruthless but it is the people that use them to inflict pain and suffering on others that makes the whole thing into a grand tragedy.
This mirrors my experience. About six months ago, I had an accident on the metro in my city, where my leg slipped between the train and the platform while deboarding. Not only did passersby not help, the "operators" stood idly by. While I am lucky to have escaped with "just" an ACL tear, I would have lost my leg that day if my friend hadn't been there to pull me out.
Glad to hear you managed to get out of the incident alive.
> The conductor saw this and refused to re-open the doors even though the guy seemed to be visibly distraught and maybe even in pain.
I'm not generally a litigious person (at least I hope not), but that sounds like a lawsuit...
It's one thing to refuse to open the doors just to let a latecomer on board, but it's quite another for a person in a position of authority (and I would argue with a duty of care) to harm someone or put them in danger like this.
Indeed. And that is why those of us with enough sad experience of human nature, or enough reading of history, naturally fear new technology: we fear it will give more power to those who would use it against us, a fear well justified by what has happened in the wake of new technologies spreading over the last 10,000 years.
Charlie Brooker, of Black Mirror fame, said (paraphrasing): it's not technology that sucks -- it's people.
No it's not. People are great. The problem is that we are not OK. We can't help ourselves, much less others. A lot of us are sick and don't have help. We are silently succumbing to the suffering of an increasingly dehumanizing environment. Machines should be beside us, to help, not between us, stopping us from fully interacting with each other.
> The problem is that we are not OK. We can't help ourselves, much less others. A lot of us are sick and don't have help. We are silently succumbing to the suffering of an increasingly dehumanizing environment
ok, maybe? i don't think that's universally true
> Machines should be beside us, to help, not between us, stopping us from fully interacting with each other.
and who puts them there and uses them as such?
who comes up with things like DRM'd juice packets?
people anthropomorphize machines:
No."the car rolled over and crushed..." or "the train doors closed in his face"
not as romantic, but closer to the truth."the reckless driver rolled his car..." "the uncaring conductor closed the doors in his face" "the bureaucratic corporation and engineers designed the train leaving no way to override the doors by the operator"machines, computers included, are simply powerful levers. they amplify forces applied - good or evil or dehumanizing - by their users. nothing more.
not all people are good, and when those people use powerful machines to amplify their intentions, it can have enormous negative effects.
> ok, maybe? i don't think that's universally true
I get your point. Some people seem to be thriving. Yes, but those aren't the ones who suck. The ones that are thriving and do suck are, in the end, succumbing to the increasingly dehumanizing environment.
> and who puts them there and uses them as such?
> who comes up with things like DRM'd juice packets?
We do. People do. Yes. Those people suck. They are not ok and are clearly succumbing to the increasingly dehumanizing environment.
> people anthropomorphize machines:
Oh yes they do. Excellent point! In the end, it's all people. I totally agree. We cannot let responsibility die as an orphan. Those people suck, but ... well, you know, they "suckumb" ;).
> machines, computers included, are simply powerful levers. they amplify forces applied - good or evil or dehumanizing - by their users. nothing more.
> not all people are good, and when those people use powerful machines to amplify their intentions, it can have enormous negative effects.
Yes. Without any doubt. We need to counteract that. Let's work together. We can!
"It is inevitable that we face problems, but no particular problem is inevitable." - David Deutsch - The Beginning of the Infinity.
> People are great. The problem is that we are not OK.
Come again? Seems like the latter disproves the former. After all, if people are so great, why would they build a machine that proliferates suffering?
What kind of man builds a machine to kill a girl? A smart one.
"Smart" does not imply good, nor virtuous. Merely capable.
To maintain the level of throughput we expect and need, trains have to run to a rigid schedule. If things were re-jigged so that trains could leave a bit late for some reason or other, the resulting reduction in throughput and corresponding passenger miles would outweigh your inconvenience. After all, trains drove the mass adoption of accurate timekeeping.
This is accounted for in my "panoptic computronium cathedral"™ with reconfigurable and individual tube systems for transporting individuals at high speeds [1]. The AGI decides who needs to be where and for what purposes and then reconfigures the tube system to accomplish the task. It is much more efficient and does not require any timekeeping devices.
Sounds very similar to the existing road infrastructure.
The tubes prevent accidents, that is the main innovation.
You’re still going to hold up other people if you stop to chat or stick your arms out of the tube.
The AGI takes care of that as well. The tubes are reconfigurable so you can't be where you're not supposed to be according to the AGI's plan. Even if you're chatting the tube will suck you up and transfer you to the required destination.
I think this is a phenomenon we observe when people become parts of machines. The cops, the conductor, a bureaucrat at a paperwork filing office, they all take on this position of neutrality, of deference to the system in which they function. Any autonomy they exercise becomes an unpredictability in the smooth operation of the machine, one which they believe is more good than bad and a worthwhile trade off, so in situations where the answer is not clear, the mechanics of the machine must be maintained. Truly they are cogs in a machine.
In a McLuhanistic interpretation,
"We become what we behold. We shape our tools and then our tools shape us"
I don't feel this kind of "ruthlessness" is particularly inherent to computers.
It seems to me an inherent property of the non-human universe, i.e. "everything but us" - gravity is extremely unforgiving if you happen to step off the wrong thing, a sharp rock can slice you open with a moment's inattention. If you find yourself in the wrong environment you will die of thirst or hunger or asphyxiate. Nature is not "kind". Eventually even our bodies turn on us.
Human kindness and judgement have a very limited sphere of influence; it just (rightly) feels huge because it's at the center of human life.
Nature is not kind, but neither is it vindictive; the wild world outside humanity will generally treat you with mere indifference. Conversely, there is nothing on earth capable of inflicting more sustained, intentional harm than a group of humans with hearts full of righteous certainty.
I would make a stronger and more explicit claim. Even speaking of "nature" as "indifferent" is wrong, because it is a category mistake. Indifference presupposes a capacity to care, so a claim like "the rock is indifferent" isn't just false, but nonsensical. A rock that falls off a cliff onto a person below is just that: a rock that has, entirely by accident, landed on a person.
Why do I emphasize this so strongly? Because popular science, as it is wont to do, often sacrifices correctness and intellectual substance for tawdry emotional appeal and sensationalism. To say "the universe is indifferent to us" has an emotional force that the banal reality of the situation does not. And this leads to intellectual confusion and a distorted view of reality, because a category has, through emotional conditioning, been falsely attached to reality.
This article speaks of the ruthlessness of the machine 99% of the time, and then, with no logical reasoning, concludes that ruthlessness is inherently oppressive because it has the power to be. In reality, it's the opposite: it is the flawed human moral compass that makes a machine a weapon. Until we understand and address this, we will find no peace.
We will never understand and address this as a group as long as the underlying environment is the uncontrolled capitalistic one, in which the ideas and technologies that are selected for (evolutionarily) are the ones that net the greatest profit in the shortest term, without regard to long-term consequences. As long as we have this and the ability to abuse the commons, we will never deviate from our abusive and violent use of technology.
I truly believe the only solution is to bring down the current system, which includes the technology that enables it.
And replace it with what? The scary part of "tear it all down" suggestions is: how do we know the replacement will be any better? In the grand scheme of things the West is in a state of abundance and luxury, much more so than the communist experiments ever achieved. Additionally, global poverty has been trending down for decades, while global literacy has greatly increased.
The key strength of a free market seems to be that it assumes people will act in their own self-interest and creates a space where we can get a roughly 'win-win' situation: while you act in your self-interest, both you and the community are rewarded. So starting a bakery would give you a financial reward and give others baked goods at a competitive price. Assuming that changing the system will force people to stop acting in their self-interest seems to be how alternatives go wrong.
This.
"Things are bad. They shouldn't be like this." OK, could be. "We should destroy it all." Yeah, will that make things better, or worse. To not have the current set of problems doesn't mean you have no problems. You can have far worse problems. So before we agree to tear it all down, you have to convince us that 1) your replacement will actually be better, and 2) you have a realistic plan to actually bring that about. Otherwise, you're just one more vandal, destroying but not building.
I think anything that ends ecological destruction and rising CO2 levels will be better than what we have now. We will just have to be brave and work within such a new system instead of being scared children surrounded by our technology.
We have solved nuclear fission power, created vaccines for COVID, and gone to the moon. Surely we can go back to a world of greater sustainability than now.
Plenty of older societies had stricter regulations to soften the devastating effects of the free market. At the very least we should reinstitute customs that prevent unfettered economic growth and encourage more sustainable population levels.
If you "bring down the current system", do you think what results is going to be less destructive to the environment? Or will 8 billion people all do whatever they have to in order to survive, regardless of the damage it causes?
You have to have a realistic replacement for the current system, and a realistic plan to get there. Without that, bringing down the current system won't lead to something better, but to something worse.
We have a model for what happens when you “bring down the system”. Do you think it would be better if we were all living in the equivalent of Somalia? It can be very difficult to engineer a stable revolution and all too often it ends up in chaos.
> We have solved nuclear fission power, created vaccines for COVID, and gone to the moon. Surely we can go back to a world of greater sustainability than now.
I'm not sure about that at all. The three things you listed are just science/technical problems, while sustainability is a governance problem. Governance is much harder than science and technology.
So you don't actually want to "tear it all down", yes? That would indeed be very stupid. Revolutionaries almost always make things worse, and often have wicked motives. Reformers can make things better.
Unfettered capitalism is certainly worth criticism. A just economic arrangement is one that rests on a sound philosophical anthropology. Of course, a good portion of those raging against "the system" are driven less by moral concerns and more by envy masquerading as moral concern.
It’s just a tool. Like any other, it works for its master. A hammer can be used for good or bad. A knife, a brick, a gun and so on.
The problem is humans. We need people who refuse to govern using oppression. Corporate leaders who refuse to prioritise profit over human rights. And developers who have a moral compass of what should and should not be built.
But why would this technology be any different to prior new technologies…
Exactly. It's not a faceless corporation or, say, its IT systems doing good or evil. It's the (groups of) people running that corporation & wielding its resources as tools.
That said: what systems you encounter & how they behave, may tell you something about the ethics of the people that put them in place.
As a consequence: if a corporation / system / product behaves 'evil', don't direct your anger at it. Instead, direct your anger (or praise!) at the people who created & manage it.
For convenience, "company Y did Z" remains a valid phrase. As long as you're aware that "company Y" is just a placeholder meaning "people working for company Y".
There is no such thing as 'just a tool', because we have fundamental and basal instincts that cannot be stopped when we operate in a large group. Tools always interact with these basal instincts in predictable ways that cannot be stopped within our modern technological society, which has an incredible amount of inertia.
To say that anything is 'just a tool' is exceptionally naive and only works in small group settings.
We will never find people who govern fairly, or corporate leaders who refuse to prioritise profit, as long as we maintain and support a system that is conducive to their growth, just as we will never stop the growth of bacteria in a bacteria-rich medium.
Yes, a hammer can be used for good or bad, but it will always inevitably be used for bad in an environment that encourages such uses.
> There is no such thing as 'just a tool', because we have fundamental and basal instincts that cannot be stopped when we operate in a large group.
Empirically that seems to be true.
> Yes, a hammer can be used for good or bad, but it will always inevitably be used for bad in an environment that encourages such uses.
From your previous line that I quoted, that would be any environment that has humans in large groups. Now what do you propose? Short of returning to hunter-gatherers (with the death of about 7.9 billion people), I don't see how you're going to prevent that kind of environment.
That’s why I wrote “But why would this technology be any different to prior new technologies…”
We don’t have the environment to change it
Machines have no intent, so they can't be malevolent or virtuous. The oppressive nature of all current machines is due to their design, a design that was created by human beings. Machines and computers make great servants but horrible masters, so if you are in a position to deny a reasonable request because "the computer says no", then a human being is at fault. Blaming a machine is misplaced blame.
Runaway sophomoric anthropomorphism, paranoia, and reductionism. It would be clearer to identify the legal, moral, and ethical responsibilities of those who are responsible than to blame technology. The technology is amoral, but its operators are not. Abdicating the ascription of blame through diffusion of responsibility is the greater "evil".
No. Machines are neutral, amplifying human nature and activities.
Humanity, once resource-starved, becomes inherently oppressive, as art and experiments compete with human mouths for survival. Computers just amplify that regression once it occurs.
VCs and startups trying so hard to squeeze money out of gullible buyers is a different matter from the broad claim that machines are inherently oppressive.
While the article acknowledges that another way to look at computers/machines is to call them "incorruptible", it sticks to "ruthless" and "oppression" throughout.
Instead, I'd use a different, neutral term like someone used for nature here: they are indifferent to our emotions.
But that's about right: so are shovels, or meteorites, waterfalls, or a rock falling off a cliff. The main difference is that most of those inert objects act in accordance with natural forces, whereas machines have some unnatural movement (like sideways with train doors).
The other difference is that we have introduced many more of such inert objects "acting" into our environment, but we have been doing that long before we could build sophisticated machines and computers (ceilings did fall, statues and bridges collapsed, animals killed and hurt their stewards...).
As such, I would vehemently disagree: ascribing any moral direction to objects can only confuse and introduce FUD (as has been done throughout history). With machines, we actually have the ability to choose the behaviour (adding sensors to train doors is pretty simple; see the sketch after this comment).
The fact that we don't is purely our choice, the same way we teach our kids by letting them fall, get a bad grade or experience anything negative — not because we don't love them. Do you feel like delaying a train of 1000 people because you are slightly late is ok? Would you go and thank everyone or apologize to anyone affected on the train — if you were not ruthless, you would, right?
I don't really believe the above, I am simply showing how easy it is to turn this on its head.
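To make the "we can choose the behaviour" point concrete, here is a minimal sketch in Python. The class and sensor names are invented purely for illustration (they are not drawn from any real transit system); the point is only that a door's "ruthlessness" is a policy a human wrote, not a property of the machine:

```python
# Toy sketch with hypothetical names: the same hardware can run either
# policy below; which one ships is a human design decision.

class ObstructionSensor:
    """Stub sensor; a real one would read hardware."""
    def __init__(self, blocked=False):
        self.blocked = blocked

class DoorController:
    def __init__(self, sensor: ObstructionSensor):
        self.sensor = sensor
        self.closed = False

    def close_ruthlessly(self):
        # Policy A: close no matter what is in the way.
        # The "ruthlessness" lives here, in a line a person chose to write.
        self.closed = True

    def close_with_care(self, max_retries=3):
        # Policy B: re-open and retry while the doorway is blocked,
        # then hand control back to a human operator.
        for _ in range(max_retries):
            if not self.sensor.blocked:
                self.closed = True
                return True
            self.closed = False  # re-open and wait for the obstruction to clear
        return False  # escalate to the operator instead of forcing the door

door = DoorController(ObstructionSensor(blocked=True))
print(door.close_with_care())  # False: the careful policy refuses to crush anyone
```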
Is a difference perhaps that computers tend to operate even when someone isn't directly using them in a way that most tools do not? Something like a shovel or even a forklift only operates while someone is there using it. Something like a sign or a barricade "operates" all of the time, but an individual can often move it if they need to (e.g. if a "road closed" barricade was poorly placed such that it blocks an unrelated road).
On the other hand, when "the algorithm" messes up there's often no one operating it to talk to and no way for the affected individual to bypass it. In that sense, perhaps computers behave like "entities" in a way that other tools do not?
Somewhat, but you get the same with animals we've used as "tools" for a long time (horses, donkeys and cows spring to mind).
There are plenty of things that can kill or hurt us that we've made that have no active operator either (I mentioned buildings and roads/bridges collapsing, and one could even place a shovel on a pile that slips or falls off a truck and hurts you; or use a wire gauge that's too thin for the electric current it carries; or...). Lack of care (or expertise) in whatever humans construct or build can harm you without there being an operator or any automation.
People with power and influence are the operators of oppression. Take logistics, for example: the work of delivery drivers is gamified and optimized to minimize cost. It's people who create these systems, not the computer. Computers can help create cars, vaccines, etc. as well.