From: William Pearson (firstname.lastname@example.org)
Date: Thu Apr 24 2008 - 13:31:46 MDT
Apologies if this appears twice, but I got a 550 error while trying to send it.
2008/4/23 Matt Mahoney <email@example.com>:
> --- William Pearson <firstname.lastname@example.org> wrote:
> > 2008/4/23 Matt Mahoney <email@example.com>:
> > > The distinction that humans commonly make is that if you communicate with
> > > it using language, then it is not yourself.
> > What exactly do you mean here by language? Are neural spikings a
> > language? Pheromones of ants?
> I didn't say it was the right answer. People only use language (speech and
> writing) to communicate with humans (so far).
> Anyway, why does it matter what "self" is? Are your mitochondria part of you?
> When you drive, is the car an extension of your body?
If it is possible to make computers part of a human self, through
sufficiently advanced AI or otherwise*, then that has vast consequences
for the potential paths the world will take.
If computers, and the AIs developed later, will always be separate
"selves", then conflict and paper-clipping are likely and FAI is
needed. That is not to say that humanity magnified by an AI with no
separate self will be all sweetness and light; we are quite capable of
causing conflict by ourselves.
* Computers aren't part of a human self at the moment, as they have no
inclination towards a purpose as such, and they can be taken over by
viruses and other malware, so from a design point of view they need to
be understood separately from a human.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT