Re: Copying nonsense (was Re: [sl4] Uploading (was : goals of AI))

From: Luke (wlgriffiths@gmail.com)
Date: Mon Dec 07 2009 - 13:56:22 MST


"If you got out, then you must believe in this nonexistent thing."

Actually, in that situation you'd probably be primarily motivated by a
desire not to experience your flesh being burned.

Even a Zen master will duck if you throw a Molotov cocktail at him. That's
the nature of the flesh.

More generally, re: this entire conversation: just accept the fact that if
you make a conscious copy of yourself, you'll both feel the continuity.

That continuity is itself an illusion. The past doesn't exist, except in
memory. "In" memory - not a spatial relationship.

 - Luke

On Mon, Dec 7, 2009 at 3:09 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:

> "M.>h" wrote:
> > ... sorry, but i do not get the whole problem.
>
> You understand that I can't define something which doesn't exist. When I
> say qualia or self-awareness or that little person inside your mind that
> observes the world through your senses, most people know what I mean.
>
> Let me put it this way. Consider the AI program that observed everything
> you did for the last several years until it became so good at predicting
> your behavior that none of your friends or relatives could distinguish it
> from you in a Turing test environment. Unfortunately the building containing
> the only copy of the program is on fire. It is just you and the computer in
> a room rapidly filling with smoke. There is just enough time either for you
> to get out, allowing the only copy to be destroyed, or for you to upload a
> copy of the program to a remote site over the internet with your last dying
> breath. Which do you do?
>
> If you got out, then you must believe in this nonexistent thing. Otherwise
> you would logically conclude that by preserving your memories in a form that
> can be backed up, you become immortal. Furthermore, you have the
> opportunity to enhance your intelligence and your environment by running on
> more powerful computers and embodied in better robots in the future. Why
> would you ever allow one copy to be destroyed now, and the only remaining
> copy a few decades later?
>
> Sorry for my ambiguous use of the word "you".
>
>
> -- Matt Mahoney, matmahoney@yahoo.com
>
>
> ------------------------------
> *From:* "M.>h" <m.transhumanist@gmail.com>
> *To:* "sl4@sl4.org" <sl4@sl4.org>
> *Sent:* Mon, December 7, 2009 1:58:00 AM
>
> *Subject:* Re: Copying nonsense (was Re: [sl4] Uploading (was : goals of
> AI))
>
> ... sorry, but I do not get the whole problem. Even if a clone of me walked
> up to me right here and now, having enough of my memories and claiming to
> have my 'identity', I would not care, as long as this 'double' did not use
> my resources (e.g. my credit card) and the bureaucrats left me alone!
>
> cheers,
>
> miriam
>
>
>
> On 06.12.2009 at 22:12, Thomas Buckner <tcbevolver@yahoo.com> wrote:
>
>
>
> ------------------------------
> *From:* Matt Mahoney <matmahoney@yahoo.com>
> *To:* sl4@sl4.org
> *Sent:* Sun, December 6, 2009 2:44:53 PM
> *Subject:* Re: Copying nonsense (was Re: [sl4] Uploading (was : goals of
> AI))
>
> Rewot Fetterkey wrote:
> > Can you clarify that? How, exactly, is consciousness nonexistent?
>
> By consciousness, I mean that which makes you different from a
> philosophical zombie as described in http://en.wikipedia.org/wiki/Philosophical_zombie
> But by definition, a zombie is not distinguishable from you at all. I
> really don't know how much clearer the logic could be.
>
> The problem arises because all animals, including those that have no
> concept of death, have evolved a fear of those things that can kill them.
> Humans do have such a concept, which we associate with a lack of conscious
> experience. So we all desperately want to preserve this thing that does not
> exist. We can't help it. We are programmed that way.
>
> One way to deal with this conflict is to argue that the zombie argument is
> wrong and create ever more convoluted arguments to refute it. My preferred
> approach is as follows:
>
> 1. I believe that I have conscious experience. (I am programmed to).
> 2. I know that conscious experience does not exist. (Logic irrefutably
> says so).
> 3. I realize that 1 and 2 are inconsistent. I leave it at that.
>
>
> -- Matt Mahoney, matmahoney@yahoo.com
>
> I'm with Daniel Dennett on this: the P-zombie is (to paraphrase an earlier
> poster) 2+2 = 5. Purely hypothetical, a character in a Gedankenexperiment,
> The Man Who Wasn't There. In practice, any creature with a human brain that
> could say "Ouch, that hurt" has an internal process isomorphic to what we
> experience as consciousness. Please see my post of a few hours ago on the
> Edge talk: http://www.edge.org/3rd_culture/dehaene09/dehaene09_index.html
> Consciousness in the human brain is a global pattern of activation, and we
> now have scanning methods that can tell whether that pattern appears or
> not. This scanning has been applied to comatose/vegetative patients. From
> Dr. Dehaene's talk:
>
> "Let me just give you a very basic idea about the test. We stimulate the
> patient with five tones. The first four tones are identical, but the fifth
> can be different. So you hear something like dit-dit-dit-dit-tat. When you
> do this, a very banal observation, dating back 25 years, is that the brain
> reacts to the different tone at the end. That reaction, which is called
> mismatch negativity, is completely automatic. You get it even in coma, in
> sleep, or when you do not attend to the stimulus. It's a non-conscious
> response.
> Following it, however, there is also, typically, a later brain response
> called the P3. This is exactly the large-scale global response that we found
> in our previous experiments, that must be specifically associated with
> consciousness.
> (snip)
> The P3 response (a marker of consciousness) is absent in coma patients. It
> is also gone in most vegetative state patients, but it remains present in
> most minimally conscious patients. It is always present in locked-in
> patients and in any other conscious subject."
>
> Consciousness, according to Dr. Dehaene's findings, is how the human brain
> gets around certain limitations of being an analog computer. If you've read
> Eliezer Yudkowsky's posts you'll know that his approach to AGI would not
> necessarily call for the AGI to be conscious in the sense we understand. I
> recall he said "I'm not looking for the AGI to be a new drinking buddy, at
> least not at first" or words to that effect. Paramount, to him, is that the
> AGI be Friendly, and not damage us intentionally or otherwise. While the
> human brain is a kind of analog computer, and much research is now afoot to
> emulate it on digital computers, our minds are not exactly computer
> programs. They are certainly not fungible programs running on a
> general-purpose computing machine; they are embedded in the neural
> structure itself. The mind is not fungible unless that structure is made
> fungible, which may or may not ever be possible.
> To sum up, there's no real-world way a zombie could react as if conscious,
> using human brain architecture, without being conscious. Unless you believe
> in magic. And the subject of zombies, even if such could exist, probably
> doesn't really apply to the problems of building an AGI.
>
> Tom Buckner
>
>
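
The decision logic of the oddball test Dehaene describes above is simple
enough to sketch in code. Here is a minimal Python sketch, assuming invented
threshold values, field names, and amplitudes; nothing in it is taken from
the talk or from Dehaene's actual analysis pipeline.

    from dataclasses import dataclass

    @dataclass
    class ERPRecording:
        """Responses evoked by the deviant fifth tone, in microvolts."""
        mmn_amplitude: float  # early mismatch negativity (automatic)
        p3_amplitude: float   # late global P3 response

    # Hypothetical detection thresholds, purely illustrative.
    MMN_THRESHOLD = 1.0
    P3_THRESHOLD = 2.0

    def interpret(erp: ERPRecording) -> str:
        has_mmn = abs(erp.mmn_amplitude) >= MMN_THRESHOLD
        has_p3 = abs(erp.p3_amplitude) >= P3_THRESHOLD
        if not has_mmn:
            return "no detectable response to the deviant tone"
        if has_p3:
            # Present in minimally conscious, locked-in, and conscious subjects.
            return "MMN + P3: consistent with conscious processing"
        # MMN alone: automatic detection, as in coma, sleep, or inattention.
        return "MMN only: non-conscious (automatic) response"

    # A patient whose brain registers the deviant tone but shows no P3,
    # the pattern reported in coma and most vegetative-state patients.
    print(interpret(ERPRecording(mmn_amplitude=1.8, p3_amplitude=0.3)))

The sketch captures only the asymmetry Dehaene reports: the early mismatch
response is automatic and survives coma and sleep, while the late global P3
tracks conscious processing.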


