Re: An essay I just wrote on the Singularity.

From: Tommy McCabe (rocketjet314@yahoo.com)
Date: Fri Jan 02 2004 - 06:31:51 MST


--- Mitchell Porter <mitchtemporarily@hotmail.com>
wrote:
>
> >Survival? If the first transhuman is Friendly,
> >survival is a given, unless you decide to commit
> >suicide.
>
> Or unless it thinks you're better off dead.
> http://www.metu.edu.tr/home/www41/eda.doc

If it thinks you're better off dead, then either (1)
it has a reason so compelling that you agree and
commit suicide, or (2) the AI is unFriendly. Wouldn't
you call an AI that decided someone should be dead for
no good reason unFriendly?




This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT