Re: How hard a Singularity?

From: Eugen Leitl (eugen@leitl.org)
Date: Sun Jun 23 2002 - 05:10:47 MDT


On Sat, 22 Jun 2002, James Higgins wrote:

> Dam! I haven't laughed that hard in YEARS! Laws? Are you really serious?

Yes, laws, backed by barrels of guns. The nasty old spiel.
 
> I bet you believe in gun control too (because we all know if you outlaw all
> guns then criminals won't use them, right?).

It depends on the nature of surveillance and enforcement, but, yes, you
can completely outlaw guns for all practical purposes. Technology actually
increases the asymmetry between thugs/spooks and subjects.

I'm actually monitoring the legal and technological landscape from a
cypherpunk perspective, and it certainly doesn't look good. The recent 9/11
craze is just the (rather unpalatable) icing on the cake of long-term
trends.
 
> Why the HELL would anyone working on the Singularity obey the law? By the

Because otherwise he will get his pants sued off, lose all his assets,
and land in jail for the rest of his life, or worse?

> time anyone even realized that they didn't it would be too late. Not to
> mention the fact that it takes government bodies YEARS to understand
> moderately complicated technology. Look at the Internet for an example,
> many government bodies still don't get it; and that is based on 20+ year
> old technology...

The global communication networks are rapidly becoming one of the most
transparent places mankind has ever created, and among the most subject to
spook scrutiny. In the absence of crypto-anarchy (which has very mundane
reasons for not materializing), technological advances translate
immediately into better surveillance and control. Here AI technologies
(ubiquitous realtime biosignature capture, data warehouse analysis) are a
distinctly double-edged sword.
 
> Do you suppose the superintelligence will obey the laws too? (sorry,
> couldn't resist)

I specifically excluded the feasibility of control at the later stages of
the Singularity. It is a degenerate development in that respect, since it
tends to erase information about its nucleation event. It has a natural
basin of attraction.
 
> And these regulations is why there have never been any bio-terrorism
> problems? Oh wait, there have been. Well, all it would take is ONE

Actually, we don't have any bioterrorism problems. The amount of
disruption (which is very real) exists entirely in people's heads. I
don't think sustained disruption is possible, given that people will
adapt, up to a point. (Remember the bombing of England's cities during WWII.)

I put that watershed threshold at megadeaths due to bioterrorism. If you
look at the outcomes of a number of current simulations, this is a very
high threshold for a small group to achieve. Apart from highly efficient
modes of deployment, it absolutely requires pathogen engineering.
Effective pathogen engineering is decades away even for a major government
lab.

> Singularity incident and that's it. Thinking that ANY regulation could
> have ANY noticeable effect on this problem is ridiculous. Unless you plan

Thinking that regulations have no effect whatsoever is naive.

> to take away everyone's computers and closely observe what is being done
> with the few computers you do allow. Is that what you are proposing?

This is not what I'm proposing, but it can be done, and it will be done if
we're really unlucky. In fact, I can point you to a large contemporary
initiative on the industry side, spearheaded by copyright enforcement
(though the 9/11 events do make it easier).
 
> My apologies, in advance, to the list for not playing nice in this post. I
> just couldn't believe this is a serious suggestion.

You're very welcome to shoot my argument full of gaping, bleeding
holes. (I, of course, reserve the right to return the favor in kind ;P ).



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT