RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Wed Jun 26 2002 - 18:35:23 MDT


James Higgins wrote, responding to Eli:
> >It isn't a small thing to succeed in building a real mind. I would
> >consider it a far greater proof of wisdom than membership in any
> >committee that ever existed.
>
> I would not. Building an AI would require intelligence, not wisdom. I'm
> not certain if a person without a great deal of wisdom could construct an
> AI with a great deal of wisdom, though. Haven't actually thought on that
> matter.

I agree very strongly with James here.

The abilities required to create an AGI are certainly *not* the same as
those required to make good judgments about the wisdom of launching the
Singularity at a given moment.

Making an AGI may require acute self-awareness of a sort -- insofar as
introspective psychology is one among many sources that may be useful for AGI
design. But self-awareness is not equivalent to wisdom. There have been highly
self-aware (and highly intelligent) psychopaths [please note: I am not calling
anyone on this list a psychopath, just making a general point!]. Furthermore,
one can imagine paths to creating an AGI that don't require any particular
self-awareness (e.g. detailed biological modeling).

Of course, "membership in a committee" is not proof of wisdom. The idea is
to form a committee of people with Singularity-favorable perspectives who
seem to have demonstrated some wisdom in their lives and thinking
beforehand...

I can't help but say it: The notion that "creating an AI is some kind of
'proof of wisdom'" seems to me a markedly *UN-wise* judgment ;->

-- Ben G


