RE: About that E-mail:...

From: Ben Goertzel (ben@intelligenesis.net)
Date: Sat Sep 30 2000 - 19:45:12 MDT


Here's another point:

If the first real AI is a commercial enterprise, it'll be making people
money.

Everyone will own stock in real AI ... it'll be a huge popular sensation ...
the financial aspects may drown out any troublesome philosophical aspects
in the public mind...

If they're making money off it in the short run, not many people will
really be thinking about the long run -- this is typical Homo sapiens
shortsightedness, which will work in favor of cosmic evolution in this
case.

-- ben goertzel

> -----Original Message-----
> From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com]On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Saturday, September 30, 2000 9:23 PM
> To: sl4@sysopmind.com
> Subject: Re: About that E-mail:...
>
>
> Josh Yotty wrote:
> >
> > I'm willing to bet the people working toward superhuman intelligence
> > will be hunted down. Of course, the people hunting us down will be
> > irrational, ignorant, narrow-minded and stupid.
>
> Be careful what you fear. Sufficient amounts of hatred tend to turn into
> self-fulfilling prophecies... and if somebody really did try to hunt me
> down, I sure wouldn't want to underestimate them.
>
> You'd be amazed at how often witch-hunts don't happen in First World
> countries. I can't think of anything I ought to be doing in advance to
> prepare for the possibility of violent protesters, so I don't intend to
> worry excessively over the possibility until it actually starts
> happening. There are essentially two strategies to deal with
> anti-technology crusades: you can try to run quietly and unobtrusively,
> or you can try for a pro-technology crusade. I've observed that ordinary
> people tend to grasp the Singularity on the first try; it's the people
> who think they're intellectuals that you have to watch out for - so the
> second possibility is actually plausible. I don't know if running
> quietly is plausible - it depends on how long it takes to get to a
> Singularity. It's starting to look as though if we don't bring the
> issue into the public eye, Bill Joy will.
>
> Presently, I think it's not too much to hope that the future will not
> contain anti-AI terrorist organizations. There are anti-GM groups and
> anti-abortion groups, but it's harder to get public sympathy for a
> violent crusade against something that's only a possibility - I hope.
>
> If we do bring the issue into the public eye, turning it into an
> elitist issue isn't really going to help.
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence


