From: William Pearson (firstname.lastname@example.org)
Date: Wed Apr 23 2008 - 10:41:22 MDT
2008/4/23 Matt Mahoney <email@example.com>:
> --- William Pearson <firstname.lastname@example.org> wrote:
> > What do you think would be required for a computer to be part of your
> > "self", rather than its own "self"? You are assuming that all
> > computers will always be other "selves", I don't think it is going to
> > end up like that.
> The distinction that humans commonly make is that if you communicate with it
> using language, then it is not yourself.
What exactly do you mean by language here? Are neural spikes a
language? Are ant pheromones?
That may be a useful rule of thumb for the moment, but I am not sure
how useful it will remain if we ever invent the technological
equivalent of telepathy. Or would you say there would only be one
person then?
Personally, I'm going by application of the intentional stance: if,
the majority of the time, it helps you make predictions about an
entity or group of entities (e.g. the cells of a human body, or a
human+computer pair) to assume it is a single rational actor, then it
is a single self.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT