Re: Copying nonsense (was Re: [sl4] Uploading (was : goals of AI))

From: Stathis Papaioannou
Date: Sat Dec 05 2009 - 22:08:33 MST

2009/12/6 Filipe Sobreira:
>>>If a copy of you is made by some means then there will be two people who
>>> claim to be you. If one of them is killed then one of them is killed.
>>>I don't understand why this is such a difficult concept.
> It isn't
>>>You ask which copy will be you? What will you experience? These are not
>>> well formed questions because "you" is not a well defined concept in a world
>>> where minds can be copied.
> Change 'you' to 'observer'. A copy of an observer is made by some means,
> then there are two observers who claim to be the same person.
> Defining 'same': sharing _all_ measurable qualities.
> Are they right in their assertions? Yes, in a sense, but only for a _third_
> observer. Under their own perceptions, they're not the same person. They do
> not share all measurable qualities. No matter how hard you argue that they
> do, no matter how you naively troll me by saying you don't believe in the
> soul: one doesn't need to believe in the soul to verify that if they are two
> _observers_ then they're also part of the experience, and they can also
> measure their own qualities. By the time the copy is created/instantiated,
> both the copy and the original (whether or not they know who's
> who) will be able to measure their positions relative to the other (a
> measurable quality), and they'll also be able to tell themselves apart, just
> by poking themselves and verifying that the other doesn't feel anything.
> Both of them will retain their own 'sense of self'. If you don't believe in
> or don't know what a sense of self is, imagine it as the reflexive mental
> process that informs each observer that he is a separate being. It is what
> makes people refer to themselves as 'me'.

So which one is really "you"? We can know and agree on all the
empirical facts and still disagree on this. And this is because, as
Matt points out, "you" is not a well-defined concept if minds can be
copied.

> But the reflexive processes were copied to the other individual too! So
> they must be the same, right? Wrong. And that's what constitutes the
> so-called 'hard problem': both copies are neuron-by-neuron identical, so
> they should be the same, but they aren't (!), since they, as observers, can
> distinguish themselves as different from each other, measuring their
> property of me-ness and verifying that this property (perceived as being
> 'me') isn't the same for them.

The two copies can be the same, but not one and the same, since for a
start they occupy different spaces.

> In these thought experiments, Extropians and transhumanists in general like
> to ignore the fact that people are observers. Lots of people around, and
> that includes the most famous _troll_ of this list, like to believe
> themselves to be computer programs running in the brain. This is a quite
> extraordinary delusion, without any reasonable foundation. It is a blind
> dogma, no worse than those held by many religions, Abrahamic or not. Just
> ignorance and hubris disguised as intelligence and reason.

The claim is not that the mind is a computer program running in the
brain but that the brain can be emulated by a computer program.

> You guys need a reality check sometimes: check out the essay in H+, the
> transhumanist magazine:

That article is full of non sequiturs and outright factual errors.
For example, it claims that neurons in the brain are not replaced
throughout life. They are replaced, but gradually over time, unlike
dividing cells. A neuron is like a car in which you gradually
replace parts as they wear out, so that eventually none of the car
is original. A cell in the gut epithelium is like a car that is
scrapped when it develops faults and replaced with a new car from the
factory. In both cases you end up with a car that looks like the
original but isn't, and the same happens to the human body over time.

Stathis Papaioannou

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT