From: Matt Mahoney (email@example.com)
Date: Mon Feb 09 2009 - 08:55:08 MST
--- On Sun, 2/8/09, Filipe Sobreira <firstname.lastname@example.org> wrote:
> This whole topic is very interesting, but I find it hard to
> discuss without knowing what the others believe about some of
> the basic principles, so this is a question to the group as
> a whole: What does your theory of personal identity consist of?
> What does something need to have to be called 'you'?
Animals and young children that have no concept of death have nevertheless evolved to fear most of the things that can kill them. Death seems to be a well-defined (learned) concept until you introduce ideas like AI, uploading, copying, teleportation, and the brain as a computer that can be programmed.
You don't know whether the universe is real, or whether all of your sensory inputs are simulated by a computer that exists in a world you know nothing about. You don't know whether your lifetime of memories was programmed in the last instant by a computer in a world where time is a meaningless, abstract concept. If you were destroyed and replaced with an exact copy every second, each new copy would be unaware of it. If your memories were erased and replaced with those of a different person, the new person would be unaware of it. Exactly what is it that you fear?
The correct question is not what you should do, but what you will do. The way our brains are programmed, we will probably treat machines that look and act like people as people, and give them legal and property rights. We will see our friends die and then appear to be brought back to life as machines, and we will want to do likewise. The implications are much easier to work out this way. We don't need to get hung up on meaningless discussions that try to define "you" in terms of the identities of atoms.
-- Matt Mahoney, email@example.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT