Re: Copying nonsense (was Re: [sl4] Uploading (was : goals of AI))

From: M.h (m.transhumanist@gmail.com)
Date: Sun Dec 06 2009 - 23:58:00 MST


... sorry, but i do not get the whole problem. even if a clone of me
walked up to me right here and now, with enough of my memories and a
claim to my 'identity', i would not care, as long as this 'double'
did not use my resources (e.g. my credit card) and the bureaucrats
left me alone!

cheers,

miriam

On 06.12.2009 at 22:12, Thomas Buckner <tcbevolver@yahoo.com> wrote:

>
>
> From: Matt Mahoney <matmahoney@yahoo.com>
> To: sl4@sl4.org
> Sent: Sun, December 6, 2009 2:44:53 PM
> Subject: Re: Copying nonsense (was Re: [sl4] Uploading (was : goals
> of AI))
>
> Rewot Fetterkey wrote:
> > Can you clarify that? How, exactly, is consciousness nonexistent?
>
> By consciousness, I mean that which makes you different from a
> philosophical zombie as described in http://en.wikipedia.org/wiki/Philosophical_zombie
> But by definition, a zombie is not distinguishable from you at all.
> I really don't know how much clearer the logic could be.
>
> The problem arises because all animals, including those that have no
> concept of death, have evolved a fear of those things that can kill
> them. Humans do have such a concept, which we associate with a lack
> of conscious experience. So we all desperately want to preserve this
> thing that does not exist. We can't help it. We are programmed that
> way.
>
> One way to deal with this conflict is to argue that the zombie
> argument is wrong and create ever more convoluted arguments to
> refute it. My preferred approach is as follows:
>
> 1. I believe that I have conscious experience. (I am programmed to).
> 2. I know that conscious experience does not exist. (Logic
> irrefutably says so).
> 3. I realize that 1 and 2 are inconsistent. I leave it at that.
>
>
> -- Matt Mahoney, matmahoney@yahoo.com
>
> I'm with Daniel Dennett on this: the P-zombie is (to paraphrase an
> earlier poster) 2 + 2 = 5. Purely hypothetical, a character in a
> Gedankenexperiment, The Man Who Wasn't There. In practice, any
> creature with a human brain that could say "Ouch, that hurt" has an
> internal process isomorphic to what we experience as consciousness.
> Please see my post of a few hours ago on the Edge talk. http://www.edge.org/3rd_culture/dehaene09/dehaene09_index.html
> Consciousness in the human brain is a global pattern of activation
> and we now have methods of scanning and can say whether that pattern
> appears or not. This scanning has been applied to
> comatose/vegetative patients. From Dr. Dehaene's talk:
>
> "Let me just give you a very basic idea about the test. We stimulate
> the patient with five tones. The first four tones are identical, but
> the fifth can be different. So you hear something like
> dit-dit-dit-dit-tat. When you do this, a very banal observation,
> dating back 25
> years, is that the brain reacts to the different tone at the end.
> That reaction, which is called mismatch negativity, is completely
> automatic. You get it even in coma, in sleep, or when you do not
> attend to the stimulus. It's a non-conscious response.
> Following it, however, there is also, typically, a later brain
> response called the P3. This is exactly the large-scale global
> response that we found in our previous experiments, that must be
> specifically associated with consciousness.
> (snip)
> The P3 response (a marker of consciousness) is absent in coma
> patients. It is also gone in most vegetative state patients, but it
> remains present in most minimally conscious patients. It is always
> present in locked-in patients and in any other conscious subject."
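>
> To make the paradigm concrete, here is a toy sketch of that logic
> in Python. The pitches, response amplitudes, and trial count are my
> own illustrative assumptions, not parameters from Dr. Dehaene's
> lab; the point is only the shape of the test: an automatic mismatch
> response to the deviant tone in every brain, and a P3-like response
> only in a conscious one.
>
>     # Toy oddball test: four standard tones, then one deviant
>     # ("dit-dit-dit-dit-tat"). We simulate an evoked response to
>     # the final tone and average over many trials. All numbers
>     # here are illustrative guesses.
>     import random
>
>     STANDARD, DEVIANT = 1000, 1200  # tone pitches in Hz (assumed)
>
>     def make_trial():
>         """One trial: four standard tones, then one deviant."""
>         return [STANDARD] * 4 + [DEVIANT]
>
>     def simulated_response(tone, conscious):
>         """Mismatch negativity is automatic; the later P3-like
>         deflection appears only when the subject is conscious."""
>         mmn = -1.0 if tone == DEVIANT else 0.0
>         p3 = 3.0 if (tone == DEVIANT and conscious) else 0.0
>         return mmn + p3 + random.gauss(0, 0.2)
>
>     def average_response(conscious, n_trials=200):
>         """Average the evoked response to the deviant tone."""
>         total = sum(simulated_response(make_trial()[-1], conscious)
>                     for _ in range(n_trials))
>         return total / n_trials
>
>     random.seed(0)
>     print("coma-like brain:", average_response(conscious=False))
>     print("conscious brain:", average_response(conscious=True))
>
> Run it and the "conscious" average sits well above the "coma-like"
> one, which is the kind of separation the real P3 measurement gives
> clinicians.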
>
> Consciousness, according to Dr. Dehaene's findings, is how the human
> brain gets around certain limitations of being an analog computer.
> If you've read Eliezer Yudkowsky's posts you'll know that his
> approach to AGI would not necessarily call for the AGI to be
> conscious in the sense we understand. I recall he said "I'm not
> looking for the AGI to be a new drinking buddy, at least not at
> first" or words close to that. Paramount, to him, is that the AGI be
> Friendly, and not damage us intentionally or otherwise. While the
> human brain is a kind of analog computer, and much research is now
> afoot to emulate it on digital computers, our minds are not exactly
> computer programs. They are certainly not fungible programs running
> on a general computing machine; rather, they are embedded in the
> neural structure itself. The mind is not fungible unless that
> structure is made fungible, which may or may not ever be possible.
> To sum up, there's no real-world way a zombie could react as if
> conscious, using human brain architecture, without being conscious.
> Unless you believe in magic. And the subject of zombies, even if
> such could exist, probably doesn't really apply to the problems of
> building an AGI.
>
> Tom Buckner
>


