RE: How hard a Singularity?

From: Stephen Reed (reed@cyc.com)
Date: Wed Jun 26 2002 - 22:28:53 MDT


On Wed, 26 Jun 2002, James Higgins wrote:

> At 08:49 PM 6/26/2002 -0500, Stephen Reed wrote:
> >Currently I manage Cycorp's participation in a DARPA project and I
> >trust this process - and believe it will work for ensuring Friendly AI.
>
> Um, small problem here. DARPA, the Defense Advanced Research Projects
> Agency, is part of the Defense Department. Or, more bluntly, the
> Military. What are the ultimate goals behind the research projects DARPA
> sponsors? To create technology usable by the military.

Agreed, but DARPA does not create technology *solely* for the
military/intelligence community.

> If such an agency were to attempt the Singularity (or just transhuman AI)
> it would certainly try to constrain and control the result. And it may not
> exactly want "Friendliness". After all, if the Defense Department needs to
> attack another country using information warfare, wouldn't you want your
> pet AI to be involved? Wouldn't work out very well if the AI always
> answered that the things it was asked to do were not friendly and, thus, it
> could not cooperate.

Let's talk about a use case to highlight your point. Certainly the
military would want an AGI to make battlespace decisions affecting
tactical defense and offense. There are already highly automated systems
that await a (superior) replacement for the "human in the loop". So a
military AGI would be tasked to "put metal on a target" of its choosing.

As I understand from studying CFAI, the AGI would simply have a unity of
will with its developers. In the military case, this means an
understanding of duty and of the correctness of the mission. An AGI does
not necessarily have to be a pacifist. The military has good, logical
arguments for why we need to have a military.

As military ethics are somewhat codified, I believe a military AGI may in
fact be a *more* desirable case for AGI development. Again, this is said
from the viewpoint of one who trusts the US military and who supports our
current missions.

On the other hand, here is a case to highlight my point:

Consider that the military seeks to avoid combat and that force is the
last resort of diplomacy. An AGI would, in my opinion, be more resourceful
in preventing combat by detecting and solving the political problems that
lead to war.

> Not to mention the fact that if a Singularity were to be attempted by ANY
> national government there would be so much special interest represented
> that it could never be done safely. Most likely, it could never be done.

My opinion is the opposite of yours in this regard. Of course I do not
want a radical Islamic government creating an AGI whose unity of will is
with gun-toting mullahs. Rather, I want US government military/civilian
research institutions, with whom I am comfortable, to direct this effort.

-Steve

-- 
===========================================================
Stephen L. Reed                  phone:  512.342.4036
Cycorp, Suite 100                  fax:  512.342.4040
3721 Executive Center Drive      email:  reed@cyc.com
Austin, TX 78731                   web:  http://www.cyc.com
         download OpenCyc at http://www.opencyc.org
===========================================================
