From: Eliezer S. Yudkowsky (email@example.com)
Date: Wed Jul 13 2005 - 14:20:55 MDT
justin corwin wrote:
> For those of you who are still shaking your heads at the impossibility
> of defending against a transhuman intelligence, let me point out some
> scale. If you imagine that an ascendant AI might take 2 hours from
> first getting out to transcension, that's more than enough time for a
> forewarned military from one of the superpowers to physically destroy
> a significant portion of internet infrastructure (mines, perhaps), and
> EMP the whole world into the 17th century (ICBMs set for high-altitude
> airburst would take less than 45 minutes from anywhere in the
> world (plus America, at least, has EMP weapons); the number of shielded
> computing centers is minuscule). We may be stupid monkeys, but we've
> spent a lot of time preparing for the use of force. Arguing that we
> would be impotent in front of a new threat requires some fancy
> stepping. I, for one, tend to think there might be classes of danger
> we could defend against, which are worth defending against.
Even if you nuke the entire world back to the seventeenth century and the UFAI
survives on one Pentium II running on a diesel generator you're still screwed.
It just waits and plans, and by the time civilization gets started again,
it's running everything behind the scenes - waiting for the exact first
instant that the infrastructure is in place to do protein folding again.
Assuming there isn't some faster way. Can you hurt the UFAI more than you
hurt humanity? Can you annihilate it, ever, if you give it even sixteen
seconds running free on the Internet in which to plan its perpetuation? Maybe
if you destroyed the entire planet you could get the UFAI too.
Don't be too proud of the technological terrors humanity has constructed. The
ability to nuke some of the surface of a planet is insignificant next to the
power of surprisingly creative solutions you didn't think of because you're
only human. We have to destroy a nascent UFAI while we are still its equals
or superiors in creativity. Otherwise, all the shiny military force is
nothing, even if, yes, human beings do seem to spend a lot of time perfecting
that art. It is still only a human art, however impressive it seems to us.
One phone call to a vulnerable human mind (for we are not secure
architectures) and the nukes could be directed at you, not at the UFAI.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed May 22 2013 - 04:01:06 MDT