From: Ben Houston (firstname.lastname@example.org)
Date: Sat May 26 2001 - 14:10:19 MDT
>If there is a large enough power differential between a superintelligence
>and us, egoism will not imply any sort of mutualism. If it doesn't
>care about us, if we have nothing to offer it, and if we're in its
>way, we're toast.
Do you really think they would be that uncaring? I would suggest that a
superintelligent creature would treat us the way we treat the lesser
animals on earth. We don't purposely go out and kill things just
because they're in the way (although we can be irresponsible if we don't
notice the consequences of our actions) -- we usually try to get lesser
animals out of our way in a "humane" way, and only when we cannot get
them out of our way do we resort to eradicating them.
We let raccoons live in the suburbs as long as they don't build their
homes in someone's attic... although raccoons tend to get run over a lot
since they are not totally adapted to all the aspects of the artificial
environment.
From: email@example.com [mailto:firstname.lastname@example.org] On Behalf
Of Mitchell J Porter
Sent: Saturday, May 26, 2001 3:37 PM
Subject: superintelligence and ethical egoism
Jimmy Wales said:
> (It wouldn't be very intelligent if it were anything but...)
A superintelligence whose supreme goal is X only needs to care
about itself insofar as its own continued existence will assist
the achievement of X. If its goal is to blow up the earth, then
once that is done it can attach zero value to further
self-preservation and shut down entirely.
> I don't think we should fear this, by the way. We should hope for it.
(this being egoism in a superintelligence)
If there is a large enough power differential between a superintelligence
and us, egoism will not imply any sort of mutualism. If it doesn't
care about us, if we have nothing to offer it, and if we're in its
way, we're toast.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT