Re: About that E-mail:...

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Sep 30 2000 - 19:22:42 MDT


Josh Yotty wrote:
>
> I'm willing to bet the people working toward superhuman intelligence will be hunted down. Of course, the people hunting us down will be irrational, ignorant, narrowminded and stupid.

Be careful what you fear. Sufficient amounts of hatred tend to turn into
self-fulfilling prophecies... and if somebody really did try to hunt me down,
I sure wouldn't want to underestimate them.

You'd be amazed at how often witch-hunts don't happen in First World
countries. I can't think of anything I ought to be doing in advance to
prepare for the possibility of violent protesters, so I don't intend to worry
excessively about that possibility until it actually starts happening. There
are essentially two strategies for dealing with anti-technology crusades: you
can try to run quietly and unobtrusively, or you can try for a pro-technology
crusade. I've observed that ordinary people tend to grasp the Singularity on
the first try; it's the people who think they're intellectuals that you have
to watch out for - so the second possibility is actually plausible. I don't
know whether running quietly is plausible - that depends on how long it takes
to get to a Singularity. And it's starting to look as though, if we don't
bring the issue into the public eye, Bill Joy will.

Presently, I think it's not too much to hope that the future will not contain
anti-AI terrorist organizations. There are anti-GM groups and anti-abortion
groups, but it's harder to get public sympathy for a violent crusade against
something that's only a possibility - I hope.

If we do bring the issue into the public eye, turning it into an elitist one
isn't really going to help.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
