From: Christian L. (email@example.com)
Date: Wed May 22 2002 - 03:15:29 MDT
>From: "Ben Goertzel" <firstname.lastname@example.org>
>Subject: RE: [JOIN] Stephen Reed
>Date: Tue, 21 May 2002 17:18:56 -0600
> > >In my opinion, Darpa is not yet ready to fund the Singularity mission,
> > >although as it gets closer, the US military will catch on first and
> > >provide funding before the commercial sector. The safety of the nation
> > >is almost priceless.
> > I'm curious about this statement. My gut feeling is that the military
> > would consider research aimed at the Singularity as something
> > potentially very *harmful* to the nation. If the research is
> > successful, it would mean the END of the government and the military.
> > My feeling is also that there are a great many narrow-minded people in
> > the government/military sector, who are perhaps afraid of such a
> > massive upheaval.
> > Don't you think that the military would use their funds to support AI
> > intended for warfare instead of, say, Eliezer's Friendly AI project?
> > Have you yourself talked to people at DARPA (or other military programs)
> > about the Singularity? If so, what was their reaction?
>The military will have its own slant on the Singularity. It will put a lot
>of $$ into AGI with the objective of getting there before anyone else, on
>the intuition that if there is going to be a major breakthrough that alters
>the nature of the world fundamentally, they want to be the ones steering
>not the "bad guys."
This sounds reasonable, but isn't it likely that some people in the military
would prefer that the nature of the world didn't change fundamentally, and
would try to keep the world in its current state, even if that would be
futile in the long run?
Maybe my intuitions in this matter are shaped by too many bad Hollywood
movies depicting the military as evil...
A more likely scenario may be that the military funds narrow-field AIs, and
not AGI. Most people don't think that the Singularity is likely within our
lifetimes, and I guess the military would be no exception here.
>Sure, it may not make any difference who steers the advent of the
>Singularity. And it may not make any difference whether one tries for
>Friendly AGI or not either. But the military, like Eliezer, will make its
>I don't think the military will place a huge focus on Friendly AI, but I
>suspect they will place more focus on defensive than offensive applications
>of AI. Offensive applications would be too easily taken by enemies and used
>against us. Software is a lot easier for enemies to steal or copy than
>offensive weapons like nukes or fighter planes, say.
This is a very good point. Aggressive AI would probably be a very bad idea
even if there were no risk of it being stolen...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT