RE: AGI Prototyping Project

From: Ben Goertzel (ben@goertzel.org)
Date: Mon Feb 21 2005 - 06:32:39 MST


> So, a number of sub-questions:
>
> * Is Friendliness a religion to be hard-wired into AGI?

In a sense, yes.

> * Is a sectarian AI a problem for us, here and now? Do we care if we just
> build what we can and impose our current viewpoint? Do we back our
> beliefs in a gamble affecting all people if we are successful?

That's too simple a perspective -- one can only impose one's current
viewpoint as the initial condition of a dynamic process. A viewpoint is not
the sort of thing one can expect to be invariant under a long period of
radical self-modification.

> * Is a non-sectarian AI a problem for us - do we care if someone ELSE
> builds a religious AI that we don't agree with?

Very much!

> Now, an assumption which I disagree with is that human life has any
> value other than its intelligence.

Well, any value to *whom*?

The human race has value to *me* other than its intelligence....

To the universe as a whole, it's not clear how much "value" *intelligence*
has (nor in what sense the concept of "value" applies)...

> There are four major ways to be frightened by AGI that come to mind now,
> only one of which I think is worth worrying about.
>
> 1) Skynet becomes self-aware and eats us
> 2) AGI kills us all in our own best interests. How better to eliminate
> world hunger?
> 3) AGI needs our food, and out-competes us. Bummer.
> 4) AGI destroys our free will
>
> I am only worried about (1).

There is a lot of middle ground between these extremes, combining factors of
several of the options...

> * In AGI, psychological instability will be the biggest problem, because
> it is a contradiction to say that any system can be complex enough to
> know itself.

Perhaps no complex AI system can know itself completely, but there can be
ever-greater degrees of approximate knowing; humans are nowhere near the
theoretical maximum...
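A minimal sketch of one way to make that intuition concrete (my own
construction, not anything from this thread; the sizes S and m and the
truncation-based "self-model" are toy assumptions): if a system's full state
is S bits and a proper subsystem of m < S bits holds its self-model, then by
pigeonhole the model must conflate 2^(S-m) distinct full states, so complete
self-knowledge is impossible while the model is a proper part, yet the
approximation sharpens as m grows toward S.

    # Illustrative sketch, not from the thread: a lossy self-model built by
    # truncating the full state to its first m bits. Toy parameter choices.
    from itertools import product

    S, m = 4, 2  # bits of full system state vs. bits in the self-model

    def self_model(state):
        """Lossy self-model: keep only the first m bits of the state."""
        return state[:m]

    # Group the 2^S possible full states by the model state they map to.
    buckets = {}
    for state in product((0, 1), repeat=S):
        buckets.setdefault(self_model(state), []).append(state)

    for model_state, states in buckets.items():
        print(f"model {model_state} stands for {len(states)} full states")
    # Each 2-bit model state stands for 2^(S-m) = 4 full states: the model
    # is necessarily approximate, but enlarging m shrinks the ambiguity.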

-- Ben G


