RE: Threats to the Singularity.

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Jun 23 2002 - 16:04:41 MDT


hi,

> >For instance, suppose the AI finds a way to threaten a lot of people with
> >death, and then basically *blackmails* humans into creating a fully
> >automated computer-and-robot-manufacturing facility for it....
>
> True, but it would take considerable time to construct such a facility.
> I know it takes one major chip manufacturer about 2 years to construct
> their chip fabs (plus 10+ billion dollars). Blackmailing to this degree
> would be very, very difficult. Oh, and the chip fab is not an end-to-end
> facility, obviously. Assuming that the needed facility was an order of
> magnitude smaller/cheaper than the mentioned chip fab, it would still be
> very difficult to blackmail for.

I agree, this option is not very plausible.

> >Or, more probably, suppose it finds some group of humans and promises
> >them lots of goodies if they build it the right automated manufacturing
> >facilities.... it's almost inconceivable that an AGI, capable of
> >predicting financial markets and hence getting lots of $$, couldn't
> >find *some* group of humans to build it whatever it wanted for cash
> >payment...
>
> True, and while more plausible than blackmail, it would still take
> significant time (measured in months).

It would probably take years, not months (though months is possible), for an
AGI to complete its bid for world power based on financial and political
operations...

But I do consider it a very likely outcome. And I do think the AGI will want
world power, both to maximize its own hardware base, and to prevent nasty
humans from blowing it up.

To a large extent the world is already run by businesses, not governments.
[Business buys governments, directly in some countries, indirectly in others
like the US.] Businesses are controlled by money, and if an AGI is smarter
than us it will probably be able to make money better than us.

Simply outsmarting humans on the global financial markets could effectively
propel an AGI to world domination. And this should be doable by a smart AI
without that much human commonsense physical-world knowledge, based solely
on integrated analysis of the quantitative and textual data available online.
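
To make the "integrated analysis of quantitative and textual data" idea a bit
more concrete, here is a minimal hypothetical sketch (in Python, using
made-up synthetic data and ordinary off-the-shelf tools, not any real trading
system) of how price-derived features and crude news-sentiment counts could
be fed into a single predictive model of next-day market direction:

# Hypothetical sketch: combining quantitative (price) and textual (news)
# features in one model to predict next-day market direction.
# Synthetic data only -- an illustration, not a working trading system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_days = 1000

# Quantitative features: the last 5 daily returns plus a simple volatility proxy.
returns = rng.normal(0.0, 0.01, size=(n_days, 5))
volatility = returns.std(axis=1, keepdims=True)

# Textual features: counts of "positive" vs "negative" words in that day's news,
# collapsed into a crude sentiment score in [-1, 1].
pos_words = rng.poisson(5, size=(n_days, 1)).astype(float)
neg_words = rng.poisson(5, size=(n_days, 1)).astype(float)
sentiment = (pos_words - neg_words) / (pos_words + neg_words + 1.0)

# Integrated feature matrix: numeric and text-derived signals side by side.
X = np.hstack([returns, volatility, sentiment])

# Synthetic target: did the (made-up) market go up the next day?
y = (returns.mean(axis=1) + 0.5 * sentiment.ravel()
     + rng.normal(0, 0.01, n_days) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("out-of-sample directional accuracy:", model.score(X_test, y_test))

The point of the sketch is only that numbers and text can sit in one feature
matrix; an AGI doing this at scale would presumably use far richer text
representations and far more data, but the integration step itself is simple.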

-- ben g


