Re: [sl4] Uploading (was : goals of AI)

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Tue Dec 01 2009 - 21:02:18 MST


I wrote:
> Suppose there was a program that simulated you so well that nobody could
> tell the difference between you and the program in a Turing test
> environment. What is the probability that the program will be you after you
> shoot yourself?

Surely everyone recognizes my question as nonsense.

The more important question is: what will it take to convince you to shoot yourself?

The answer surely depends on the process by which the upload is created, even if the end result is the same. There cannot be an "appearance" of death, because, as we know, all animals are programmed by evolution to fear the things that can kill them.

So naturally we value continuity of experience, and continuity of the physical body, even though rationally it makes no difference. Having two copies exist at the same time destroys the illusion.

For a similar reason, it is important that nobody else can tell the difference between you and your copy. Otherwise they would believe the upload had failed and would not try it themselves. It makes no difference to your copy whether or not the memories differ, because it would be unaware of any differences.

It's an important question because the technology is coming. The first uploads won't be very accurate, but over time the technology will improve.

 -- Matt Mahoney, matmahoney@yahoo.com



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT