RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Wed Jun 26 2002 - 23:08:17 MDT


Hi Stephen,

> As military ethics are somewhat codified, I believe a military AGI
> may in fact be a
> *more* desirable case for AGI development. Again this is said from the
> viewpoint of one who trusts the US military and who supports our current
> missions.

Oh man, this is a scary day of e-mails!!!

First we have Eli, who thinks that whoever can build an AGI is
intrinsically possessed of tremendous wisdom...

And now, this assertion that the US military should be trusted with AGI and
the Singularity...

My Singularity optimism is rapidly getting the jitters! Eugen, maybe you're
right.... (Just kidding -- partly...)

I don't doubt that there are many trustworthy, moral individuals within the
US military. However, I do not trust the US military as an organization,
no way, José. This is the crew that invaded Grenada... the organization
that nuked Japan (probably both bombs were needless, but it's pretty damn
clear the second one was), that systematically tortured Vietnamese women and
children in the name of democracy and justice.... I'll spare you a full
list of examples. Ever read the first-person account of how US soldiers
chopped off the arms and legs of a Vietnamese woman, inserted dynamite in
her vagina and blew her up? Not an outlier occurrence. Excuse me if the
impeccable morality of the US military seems a little questionable to me...

Yeah, the US military has done some good (I didn't like the Taliban either).
And some evil. And the people who run it are charged with protecting the
interests of the USA (the military and ECONOMIC interests, as history amply
shows), not with promoting the general good of the human race. If a
Singularity is gonna lead to the greater general good for sentient life, but
may lead to the dissolution of the US as an entity and the consequent
meaninglessness of the US military and ranks like "general" and "admiral",
do you think the Joint Chiefs are gonna go for it??? Hmmmm....

> My opinion is the opposite of yours in this regard. Of course I do not
> want a radical islamic government creating an AGI whose unity of will is
> with gun-toting mullahs. Rather I want the US government
> military/civilian research institutions with whom I am comfortable to
> direct this effort.

The Singularity is directly opposed to the national interests of any
particular national government, because it will almost inevitably lead to a
"revolution" that will make national boundaries meaningless.

Thus, no government should be trusted to play a leading role in the
Singularity.

Conceivably, some organization such as the UN could be trusted with this
role, but the current UN is obviously not well-suited.

I agree that the US gov't would do a better job of governing the Singularity
than the Taliban. But this is a moot point, because there are no
technologically savvy nations with governments as immoral and psychopathic
as the Taliban. The closest examples you could find would be China or
India, but their gov'ts, though imperfect, are no Taliban, and they have
far less chance of achieving AGI than the US, Western Europe, or Japan,
where advanced computers are far more widespread.

I might accept gov't funding for AGI research, but I would never willingly
place control of an AGI in the hands of any military organization. That
really scares me. Those people are not Singularity-savvy, and I don't trust
that they will become so in the future, not in any healthy way. Plus, their
interests are not those of the human race as a whole, let alone of sentience
as a whole.

-- Ben G
