Re: Threats to the Singularity.

From: Eugen Leitl (eugen@leitl.org)
Date: Tue Jun 25 2002 - 00:16:47 MDT


On Mon, 24 Jun 2002, James Higgins wrote:

> Much of this, yes. Stealing credit cards, no. Why bother? Much too
> high profile for such small profit potential.

That was just an example. The point is that you can leech off the monetary
system, more or less invisibly, at least in the short run, to get enough
game money to play with.
 
> By turning off all their computers? If an AI took over all computers

We start with a meaningful seed in a small installation. This is not yet
another worm hitting the Net. It means you have to shut down the Net
*completely* in order to purge it, since the world is a lot bigger than your
installation. This assumes you know at all that something is afoot
(if you hack the routers, you can even hide traffic from accounting), and
that you are not distracted by something else -- e.g. a nuclear strike in
progress makes a great smokescreen. Depending on a number of factors, the
system can seize control of its own physical layer in hours to days. Notice
that we're talking about a world several decades downstream, with a lot more
fabbing automation, including small-scale fabbing of molecular circuitry.
This means you're within touching distance of machine-phase nanotechnology,
provided you understand computational chemistry and can control molecular
fabbing capacities.

> I could see the vast majority of them being switched off. This would

I could see a lot of infrastructure being destroyed in sacrificial
smokescreen activities. After a brief while you can switch off everything
you can lay your hands on, but by then it is already too late.

> also have the effect of breaking down the net and preventing any
> remaining nodes from communicating. Which would be the result if the
> AI conducted high-profile activities and became known.

Right, so it won't conduct high-profile activities, and will lie low as
long as it has not yet entered bootstrap. Monkeys are not that hard to fool
if there isn't a monkey doing the reading.
 
> >There is no such thing as a powerful human from a Power point of view.
>
> Not true. We recognize silverback gorillas, queens of ant colonies and
> the heads of prides of lions. Are they as powerful as us? No. But we
> still recognize that they are more powerful than their peers.

Can you give me a rational reason why I should consider them peers, and
exercise enough restraint to tiptoe around them while they're in the way?
In the long run, I mean?

Another question: what is your projected lifetime of the biosphere as we
know it? (I.e., more or less what is currently left of it.)



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT