From: Stathis Papaioannou (email@example.com)
Date: Mon Mar 31 2008 - 05:10:07 MDT
On 31/03/2008, Lee Corbin <firstname.lastname@example.org> wrote:
> > But is it possible that your brain is partly zombified right now,
> > with the normal, conscious part systematically deluded into
> > thinking that it can see when in fact it is completely blind?
> I'm not sure what you mean by "normal, conscious part".
> One possibility is that only my speech centers are being
> authentically computed while the rest of my cortex is
> being merely looked up. I would be having very, very
> little in the way of experience, of course.
I guess it is possible that you could be a complete zombie if only
the lower motor neurons controlling speech are normal and the rest of
the brain is looked up, since presumably lower motor neurons are not
involved in consciousness, or at any rate not in very much of it. But
in the thought experiment we could identify every
neuron from the retina up that is involved in visual processing, and
replace these with looked up analogues. You would then have no visual
experience at all, but you would look at a picture of a dog,
accurately describe the picture, write an evocative poem about it, and
truly and honestly believe that you are looking at a picture of a dog;
while in fact you are completely bereft of any visual experience. If
you believe that this is possible, then you have to admit that you
have no evidence right now to support the theory that you are actually
seeing this email. You think you are, but if some medical
investigation shows that you have been blind for the last twenty years
without knowing it, you won't be able to dismiss it as absurd. And,
given that you have always enjoyed (or thought you enjoyed!) your
visual experiences, you might seriously consider an expensive and
slightly risky operation to restore your sight.
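The behavioural equivalence the thought experiment turns on can be sketched in a few lines of code (the names and the toy "neural" function here are purely illustrative, not anything from the discussion): a computation's outputs are precomputed into a table of records, and the table is then consulted instead of running the computation, with no externally detectable difference.

```python
# Illustrative sketch only: a "looked up" analogue replays precomputed
# records instead of computing, yet is behaviourally indistinguishable.

def computed_response(stimulus: int) -> int:
    # stand-in for genuine neural processing of a stimulus
    return (stimulus * 31 + 7) % 100

# Precompute every response in advance (the "records"):
lookup_table = {s: computed_response(s) for s in range(100)}

def looked_up_response(stimulus: int) -> int:
    # no processing occurs here, only retrieval of a stored record
    return lookup_table[stimulus]

# No behavioural test can tell the two apart:
assert all(computed_response(s) == looked_up_response(s)
           for s in range(100))
```

The whole dispute is about whether anything over and above this external equivalence (i.e. experience) survives the substitution.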
> > If you allow the possibility of partial zombification then you have to
> > allow that not only can an external observer never know a person's
> > subjective experiences, but the person himself can never be sure that
> > he is having the experiences he believes he is having. This seems a
> > high price to pay in order to maintain that a LUC cannot be conscious.
> You are entirely correct. It is a high price. I wish I knew of any
> other way out. But it does not seem to me to be so high as the
> price paid by you and others, namely, that only a tiny bit of your
> consciousness is associated with a certain body in a certain city
> on planet Earth, and that the vast, vast, vast majority of your
> experience takes place in other objects, or---in another manner
> of speaking---doesn't take place at all, but only corresponds
> to patterns that pre-exist only in Platonia.
At least you see what you think of as an absurdity and adjust the
theory (functionalism) to remove the absurdity. Most philosophers and
cognitive scientists dismiss the apparent absurdity and simply carry
on as before.
> >> Now, let me phrase my answer using an experiment so that there is
> >> no mistaking my meaning. Question: would you prefer
> >> (A) to be tortured for an hour in the old-fashioned way
> >> (B) for records of such an hour merely to be retrieved from
> >> a galaxy far, far away a long time ago in which you were
> >> tortured just the same, and merely the states for that hour
> >> interval brought to Earth and at the proper moment merely
> >> looked up?
> > I guess it depends on whether the looking up of the records repeats
> > the experience which is what we are debating. If the recording were
> > accurate down to the atomic level then yes, I think it would reproduce
> > the same experience as the original.
> To be precise, an entire state is swapped into the location where you
> reside once every trillionth of a second. None of the states is causally
> connected to any of the others: they could even have been retrieved
> from extremely disparate and separated areas of intergalactic space,
> by some random process that just happened to find patches of dust.
> (Those, by an amazing coincidence, do happen to be the same states
> that are now being computed, a trillion to the second, in the city where
> you live, by the ordinary metabolic processes (computations).)
> So can you recapitulate and answer (A) or (B) for sure? Thanks.
> For of course, if your answer differs from mine, I have further
In that case, (A) and (B) would be equivalent.
> > The argument above proves that a functionally perfect simulation of a
> > brain must be as conscious as the brain *unless* partial zombification
> > is possible. The possibility of mind uploading is something else you
> > might have to give up if a LUC [looked up computation] cannot be
> > conscious.
> Not at all. Uploading is entirely orthogonal to the argument we're having.
> The only feasible way to upload someone is to entirely simulate with near
> perfect fidelity the actual computations his or her brain is already achieving.
> Whether I or some part of me could in theory be "looked up" now is
> exactly the same issue as to whether some part of me could be "looked up"
> after I'm uploaded.
Someone could claim that a perfect simulation might behave like the
original, but it might be unconscious, or at least differently
conscious. This is the layperson's usual response to the idea of mind
uploading. I claim that the argument whereby, if part of your brain
were replaced by the simulation, you could not possibly notice that
anything had changed *proves* that the simulation must be conscious in
exactly the same way as biological tissue. But, according to you, this argument
proves nothing since it is possible to feel exactly the same while
undergoing a process of gradual zombification. If that's so then there
is no *proof* that an upload will be conscious in the same way as the
original. You might think that it sounds reasonable, but ultimately
you have to take it on faith. You also have to admit that you might be
at least partially a zombie right now, since feeling fully conscious
does not count as evidence that this is in fact the case.
-- Stathis Papaioannou
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT