The role of consciousness (Re: The GLUT and functionalism)

From: Matt Mahoney
Date: Sun Apr 06 2008 - 16:31:00 MDT

--- Lee Corbin <> wrote:

> Matt writes

> > In that case Homer without a hippocampus is conscious,
> You seem amazingly neutral about the meaning of such
> concepts. Are you so sure that mathematics is meaningful
> to any of our inquiries if the real-world meanings of
> any terms used in the math are entirely arbitrary?

Consciousness (in the sense of qualia or having experience) is irrelevant to
the function of the brain or any other computer. However, the belief in
consciousness is real and relevant, as it plays a role in the justification of
ethical decisions that people make.

But the definition is arbitrary. I might as well define an entity as
conscious if it has a name. Thus, we name cats and dogs, but not chickens and
cows. The Jetsons have a robot maid named Rosie, therefore it is conscious.
This makes more sense than defining it as a mathematical property of
computation, which invariably leads to absurd conclusions, such as either
humans are not conscious or clouds of intergalactic dust are conscious (as the
last several hundred emails on this thread have argued).

> > But consciousness is just a distraction, a futile attempt to
> > extend our ethics to AI.
> Well, I'm not quite sure why you consider it futile. After all,
> you will sooner or later (assuming that things go well for you)
> be in a position to choose certain outcomes at the expense
> of certain other outcomes. Have you no desire for a guide
> to which actions of yours will be consistent with other
> actions? May I inquire as to whether you consider your
> survival beyond normal human lifespan to be possible, and
> if possible, desirable?

I would like my decision whether or not to upload to be based on real
information, not on faith, like the suicide bomber who believes he will go to heaven.
However, that is not the case. All I can say with certainty is that after I
upload, there will be something that claims to be me that is realistic enough
to fool all of my friends and relatives. The relevant facts are my fear of
death (therefore I will upload) and belief in consciousness (therefore I
believe that my future upload will be "me" if I preserve my memories).

But every thought experiment I do proves that my beliefs are wrong.

Example: teleportation. It works like this. I step into a booth at point A
and a copy of me is produced at point B. The copy at point A is slowly and
painfully killed by being crushed between the soundproofed walls of the
teleportation booth in a process that takes 24 hours. For $2 extra I have the
option of having the copy at point A injected with an overdose of a narcotic,
making the death fast and painless. But I have teleported hundreds of times
both ways and can't tell the difference. I always come out at point B and I
would rather save the money.

Am I making the right decision? My tribal elders can't help me. They say
that consciousness can't be copied.

-- Matt Mahoney

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT