From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed Feb 27 2002 - 12:34:30 MST
> Eliezer wrote:
> > I can't speak for Ben, but certainly SIAI does not advocate (and never
> > has) the explicit codification of knowledge distilled from human experts.
> > Knowledge is learned complexity; it is abstracted from experience, not
> > hardwired by the programmers.
> There *must* be some explicitly codified knowledge in a Seed AI, at least
> enough to get it started... else, how does it think about anything?
If by knowledge you mean *declarative* knowledge, I disagree.
A baby AI need not have inbuilt declarative knowledge in order to
successfully learn and mature.
The alternative is for a baby AI to have explicitly codified *procedural*
knowledge, which is set up to enable it to gather its own declarative
knowledge (and more advanced procedural knowledge) from its environment
reasonably efficiently.
A human baby has no inbuilt knowledge base (at least no one has shown that
it does), but it has inbuilt procedures such as grasping, wiggling and
sucking... and many more such procedures that kick in during development...
and these procedures are well-tuned to help it survive in the environment
it's born into (an environment that includes responsive adult humans).
I agree that if a baby AI system had no inbuilt procedural knowledge, it
would take a very long time to hit upon knowledge-gathering behaviors
useful in its environment.
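The distinction can be made concrete with a toy sketch (my illustration, not anyone's actual AI architecture): the agent below starts with an *empty* declarative store, but a hard-coded procedure, analogous to grasping or sucking, that probes its environment and records whatever it finds.

```python
class BabyAI:
    """Toy agent: no inbuilt declarative knowledge, one inbuilt procedure."""

    def __init__(self):
        # The declarative knowledge base starts completely empty.
        self.facts = {}

    def probe(self, environment):
        # Inbuilt procedural knowledge: a fixed routine that samples the
        # environment and stores any declarative knowledge it uncovers.
        for name, value in environment.items():
            if name not in self.facts:
                self.facts[name] = value


# A stand-in environment the agent can explore.
environment = {"fire": "hot", "ice": "cold"}

agent = BabyAI()
agent.probe(environment)
print(agent.facts)  # declarative knowledge now abstracted from experience
```

The point of the sketch is only that the programmer writes `probe`, not the contents of `facts`; everything declarative the agent ends up with is acquired through the procedure, not pre-entered.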
> imagine that, once a Seed AI started to learn 'on its own', then
> its learned
> knowledge base would quickly become a lot larger than its
> pre-entered data.
If one did supply one's AI with in-built declarative knowledge then, yes,
this would surely be true.
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT