From: Eliezer S. Yudkowsky (email@example.com)
Date: Wed Jul 27 2005 - 09:29:39 MDT
Russell Wallace wrote:
> Sort of. We're talking about "the will of the '...Machine'", but being
> a machine, it has no will of its own, except insofar as it is
> constructed to have one. The original version of Collective Volition
> suggested that it will _not_ have any will of its own, but just
> implement the extrapolated volition of humans _under the assumption
> that they know they are merely part of the Collective and there is no
> escape_ - that's what'll turn it into Hell.
From that eternal optimist, Rudyard Kipling:
Whether the State can loose and bind
In Heaven as well as on Earth:
If it be wiser to kill mankind
Before or after the birth--
These are matters of high concern
Where State-kept schoolmen are;
But Holy State (we have lived to learn)
Endeth in Holy War.

Whether The People be led by The Lord,
Or lured by the loudest throat:
If it be quicker to die by the sword
Or cheaper to die by vote--
These are things we have dealt with once,
(And they will not rise from their grave)
For Holy People, however it runs,
Endeth in wholly Slave.

Whatsoever, for any cause,
Seeketh to take or give
Power above or beyond the Laws,
Suffer it not to live!
Holy State or Holy King--
Or Holy People's Will--
Have no truck with the senseless thing.
Order the guns and kill!

Once there was The People--Terror gave it birth;
Once there was The People and it made a Hell of Earth
Earth arose and crushed it. Listen, O ye slain!
Once there was The People--it shall never be again!
But there is a vast distinction, Russell Wallace, between asking which
actions people choose, and asking which real-world consequences people would
abhor upon experiencing them. What you do not seem to realize is that even
your horror of a dark future exists only in you, and there is nowhere but
human beings that I can send an AI to find it. I do not wish to be ruled over
by The People, and there is nowhere *but* people to find such positive
determinants of the future.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Mon May 20 2013 - 04:01:01 MDT