RE: Seed AI (was: How hard a Singularity?)

From: James Higgins (jameshiggins@earthlink.net)
Date: Sun Jun 23 2002 - 13:03:16 MDT


At 08:37 AM 6/23/2002 -0600, Ben Goertzel wrote:
>Eliezer wrote:
> > A tremendous part of an AI is brainware. The most important
> > content - maybe
> > not the most content, but the most important content - is content that
> > describes things that only AIs and AI programmers have names for,
> > and much
> > of it will be realtime skills that humans can help learn but
> > which have no
> > analogue in humans.
>
>In my view, the most important content of an AGI mind will be things that
>neither the AI nor its programmers can name, at first. Namely: *abstract
>thought-patterns, ways of organizing ideas and ways of approaching
>problems*, which we humans use but know only implicitly, and which we will

OK, I'm not an expert, but this seems obvious.

For example, I seem to be much better at solving abstract design problems
and intuitively understanding complex systems than most of the people I
have worked with. Yet even if this could be factually proven, neither I
nor anyone I know would be able to explain why it is the case.

>be able to transmit to AI minds implicitly through interaction in
>appropriate shared environments.

Except, I'm not so certain that we can "transmit [this] to AI minds
implicitly through interaction..." Maybe we can, but without understanding
the nature of it we can't (or at least shouldn't) make any assumptions. We
do, of course, need to try to implement this in the AI regardless of
whether it takes programming or teaching...

James Higgins



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT