Chinese room argument

en.wikipedia.org

11 points by jermaink 8 years ago · 9 comments

cousin_it 8 years ago

For everyone who considers the Chinese Room argument obviously wrong (like me), here are two versions that are much stronger and might still make you uneasy:

In Greg Egan's "Jewelhead" stories, every person gets a computer implanted in their brain at birth, which gradually learns to imitate the input-output behavior of the biological brain. At some point they switch to the jewel full time and throw away the biological brain, becoming immortal. That's seen as a fact of life and people don't question it much.

In one of Wei Dai's nightmare scenarios, we ask an AI to upload humans in an efficient way. Unfortunately, since humans can't introspect into the idea of "I'm conscious" very deeply, the resulting resource-optimized uploads just have a handful of hardcoded responses to questions about consciousness, and aren't in fact conscious. Nobody notices.

Of course, both cases are problematic only if you can "optimize" a human brain into something else, which would mimic the same input-output behavior without being conscious. The trouble is that we can't rule out that possibility today. Humans certainly have a lot of neural circuitry that's a side effect of something else. Some of it might get optimized out, the way a human in a sealed room can be optimized to nothing at all. To rule out a "Disneyland without children" scenario, wishful thinking isn't enough, we need to properly figure out consciousness and qualia.

  • Teckla 8 years ago

    For those hunting for those "jewelhead" stories:

    https://en.wikipedia.org/wiki/Axiomatic_(story_collection)

  • dTal 8 years ago

    In Wei Dai's "nightmare scenario", what of value is lost?

    If "consciousness", whatever that is, is completely undetectable by external means and has no effect on human behavior, I think we can safely ignore it.

    • cousin_it 8 years ago

      That seems like a misreading of the scenario. Consciousness certainly affects behavior, like making me say "I'm conscious". The question is whether the same behavior could be reproduced by a more efficient program that isn't conscious, and if yes, how do we make sure uploads are conscious.

      • dTal 8 years ago

        If the same behavior (including debating consciousness in internet threads!) could be reproduced by a more efficient program that isn't conscious, how do you know you are conscious?

        Come to think of it, these threads do get a bit repetitive - are we sure this hasn't already happened? ;)

        • cousin_it 8 years ago

          Some of my internal experiences are hard to express in words, like how the color red looks. It seems overconfident to claim that such experiences don't exist or don't matter.

          Consider a person who is completely committed to keeping some secret (e.g. that they saw a UFO). Their input-output behavior is the same as a person who doesn't know the secret, but the internal experiences are different. Do you think the AI in charge of optimizing uploads should be free to discard such differences?

dTal 8 years ago

The most famous example of an entire category of fallacious arguments about consciousness:

  1) construct a hypothetical brain on an implausible substrate
  2) note the implausibility of the substrate
  3) therefore consciousness is magic QED
  See also: "China brain"

bwasti 8 years ago

From the article, in "Replies":

The fact that man does not understand Chinese is irrelevant, because it is only the system as a whole that matters. Searle notes that (in this simple version of the reply) the "system" is nothing more than a collection of ordinary physical objects; it grants the power of understanding and consciousness to "the conjunction of that person and bits of paper" without making any effort to explain how this pile of objects has become a conscious, thinking being. Searle argues that no reasonable person should be satisfied with the reply, unless they are "under the grip of an ideology".

  • dTal 8 years ago

    Searle's note applies equally to an "ordinary" brain, which is just a collection of ordinary physical objects (atoms).
