RE: Military Friendly AI

From: Ben Goertzel (ben@goertzel.org)
Date: Sat Jun 29 2002 - 22:41:41 MDT


hi,

> > As already stated on this list several times, we intend to give Novababy
> > read but not write access to the Internet, at first, until a
> lot of study
> > has been done.
>
> Please describe how this works.. as I'm not sure you know, simply sending
> a request to a vulnerable web server formatted in the proper way will
> result in the potential for a virus or whatever to be implanted.

I do not have time at the moment to go into details on technical stuff like
this.

There is probably no way to create an absolute "anti-firewall", one that
prevents a system from ever doing even small damage.

However, by intelligently restricting the grammar of outgoing requests, one
can go very far in this direction.
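
To make "restricting the grammar of outgoing requests" a bit more concrete,
here is a minimal sketch in Python (this is not Novamente code; the host
allowlist, character set and length limits are illustrative assumptions).
The idea is that every outgoing request must be a plain GET whose URL parses
into parts that each match a deliberately narrow pattern, so arbitrary
payloads simply cannot be expressed:

# Sketch only: outgoing requests must fit a deliberately narrow grammar.
# Allowed hosts, character set and length limits are assumed examples.

import re
from urllib.parse import urlparse

ALLOWED_HOSTS = {"www.gutenberg.org", "en.wikipedia.org"}    # assumed
SAFE_PATH = re.compile(r"^/[A-Za-z0-9_\-./]{0,200}$")        # no '?', '%', spaces

def request_is_allowed(method: str, url: str) -> bool:
    """Return True only if the outgoing request fits the restricted grammar."""
    if method.upper() != "GET":
        return False                      # read-only: no POST/PUT/DELETE
    parts = urlparse(url)
    if parts.scheme != "http" or parts.port not in (None, 80):
        return False
    if parts.netloc not in ALLOWED_HOSTS:
        return False
    if parts.query or parts.fragment or parts.params:
        return False                      # no query strings at all
    return bool(SAFE_PATH.match(parts.path or "/"))

# e.g. request_is_allowed("GET", "http://en.wikipedia.org/wiki/Firewall") -> True
#      request_is_allowed("GET", "http://example.com/cgi-bin/x?cmd=rm")   -> False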

The Net is a tremendous information resource for a growing AGI baby.... Not
using it at all is simply not an option.

One option we've considered is to create a huge mirror of a large portion of
the Net for the system's use. However, this would cost a lot of money!
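
For what it's worth, a toy version of the mirror idea might look like the
sketch below (the seed URLs and output directory are made-up examples):
fetch a fixed set of pages once, store them locally, and let the system read
only the local copies afterwards. Scaling this up to any substantial
fraction of the Net is exactly where the cost comes in.

# Toy sketch of a local, read-only mirror; seed list and paths are assumed.

import os
from urllib.parse import urlparse
from urllib.request import urlopen

SEED_URLS = [
    "http://www.gutenberg.org/ebooks/1342",                      # assumed
    "http://en.wikipedia.org/wiki/Artificial_intelligence",      # assumed
]
MIRROR_DIR = "local_mirror"

def mirror(urls, out_dir=MIRROR_DIR):
    """Download each URL once into out_dir; the AI reads only these files."""
    os.makedirs(out_dir, exist_ok=True)
    for url in urls:
        parts = urlparse(url)
        name = parts.netloc + parts.path.replace("/", "_")
        with urlopen(url, timeout=30) as resp, \
             open(os.path.join(out_dir, name), "wb") as f:
            f.write(resp.read())

if __name__ == "__main__":
    mirror(SEED_URLS)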

> Why not just agree, here and now, to have it ready 6 months before?

Ummm -- because "6 months" is an arbitrary figure set by you, because it is
half the period of the Earth's revolution around the sun?

I am going to use my own common sense here. I will release this
documentation a reasonable period before we have a complete system ready to
roll, for you and other interested parties to comment on.

> > > So, since nowadays you are talking about having some kind of
> > > committee make
> > > the final decision,
> >
> > Actually, as I said very many times on this list, what I
> thought was a good
> > idea was an *expert advisory board*, intimately involved with
> the project
> > when it reaches near-takeoff stage. This does not imply that
> the advisory
> > board has final decision making power.
>
> Why not give them the power if you truly believe they have a higher level
> of combined wisdom than yourself?

Because a lot of things can go wrong with a committee, as Eliezer has
pointed out.

> > > if they come back to you and say "The .01%
> > > chance we have
> > > calculated that your AI will go rogue at some point in the far
> > > future is too
> > > much in our opinion. Pull the plug." you will pull the plug?
> >
> > In that exact case, Brian, it would be a hard decision. A .01%
> chance of an
> > AI going rogue at sometime in the far future is pretty damn small.
> >
> > What I'd really like the experts for is to help arrive at the
> .01% figure in
> > the first place, actually...
>
> So at this point, you can't answer my question? I guess it is one of those
> things best left to the heat of the moment :-)

What would your answer be, if the same question were aimed at you regarding
your own AI project?

And once you tell me your answer, why should I believe you?

> > A consensus among a large committee of individualists is not
> plausibly going
> > to be achieved on *any* nontrivial issue.
> >
>
> What if they did?

If a diverse committee of transhumanist-minded individuals agreed that going
ahead with a Novamente-launched singularity was a very bad idea, then I
would not do it.

[Unless there was some weird extenuating circumstance, like all the
committee members being brainwashed, or paid off, etc.]

However, this is not the same as your last question, because a diverse
committee of transhumanist-minded individuals would be incredibly unlikely
to say "The .01% chance we have calculated that your AI will go rogue at
some point in the far future is too much in our opinion. Pull the plug."
This statement bespeaks a lack of appreciation of the possibility that the
human race will destroy itself *on any given day* via nuclear or biological
warfare, etc. It is not at all the kind of statement one would expect from
a set of Singularity wizards, now is it?

I honestly do not believe we're *ever* going to be able to reduce "the
probability that an AI goes rogue at some point in the far future" to less
than .01%, with any meaningful degree of confidence. This kind of certainty
is not going to be available pre-Singularity.

-- Ben G
