RE: Destruction of All Humanity

From: pdugan (pdugan@vt.edu)
Date: Mon Dec 12 2005 - 11:06:20 MST


I believe Yudkowsky put forward that opinion in an earlier era, back around
2000, and has since rescinded it in favor of a much Friendlier stance: that
such an SAI would probably have "bad", i.e. unfriendly, programming at its
deeper structure. In other words, just because an entity is technically
"smarter" than you by however many orders of magnitude does not make said
entity's statements the word of god, nor its actions ethically justified.

  Patrick

>===== Original Message From 1Arcturus <arcturus12453@yahoo.com> =====
>Someone on the wta-list recently posted an opinion that he attributed to Mr.
>Yudkowsky, something to the effect that if a superintelligence should order
>all humans to die, then all humans should die.
> Is that a wild misrepresentation, and like nothing that Mr. Yudkowsky has
>ever said?
> Or is it in fact his opinion, and that of SIAI?
> Just curious...
>
> gej
>
>


