RE: Tomorrow is a new day

From: Ben Goertzel (ben@goertzel.org)
Date: Fri Nov 05 2004 - 14:38:15 MST


Hi,

> As we have discussed, other AI designs, given a certain maturity, could
> simply read Cyc to acquire all that we have hand-coded, as a shortcut to
> understanding dictionaries and encyclopedias. On the other hand, we think
> that once Cyc is capable of learning by reading, learning by being taught
> by us using dialog, and capable of learning from experience, then we can
> subsequently plug in probabilistic adapters to ground out symbolic
> concepts from real-world sensations (robotics).

Well, a while ago we ran some preliminary experiments loading knowledge
from OpenCyc and trying to do probabilistic inference to connect it with
knowledge obtained via

a) natural language parsing
b) analysis of quantitative data

Our tentative conclusion was that the style in which Cyc represents
knowledge is highly awkward compared with natural language and also with
natural representations of quantitative data patterns. In principle one
can do automated reasoning to link Cyc knowledge with knowledge acquired
through these other mechanisms, but it's much harder than it needs to be,
because of the various choices made in designing the Cyc representational
style.
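
To make the mismatch concrete (this is only an illustrative sketch, not
Novamente's actual internal format, and the truth-value numbers are made
up): a crisp CycL-style assertion such as

    (#$genls #$Dog #$Animal)

is true or false within a particular microtheory, whereas the sort of
representation that meshes naturally with language and perception is a
graded relationship along the lines of

    Inheritance Dog Animal  <strength 0.99, confidence 0.90>

where the truth-value parameters, and the lifting of the assertion out of
its context, have to be guessed or inferred rather than read off directly.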

We also tentatively concluded that -- if we want to explicitly load
knowledge into Novamente -- it would be better for us to simply enter a
dictionary into Novamente through our interactive natural language
understanding interface. That way the knowledge of word meanings (an
important kind of commonsense knowledge) would be represented, from the
start, in a style compatible with linguistically expressed knowledge. And
our way of representing NL knowledge has been designed to match up
naturally with knowledge derived via perception and action.

So, my hypothesis is that formally encoding knowledge in a style designed
without careful attention to both natural language and perception/action
is not a very good idea. The different parts of the mind are highly
interdependent, and each component needs to be thought of in terms of its
relationships with the others. In the case of Cyc, abstract declarative
knowledge and crisp inference on such knowledge were thought out basically
in isolation from all other aspects of mind, which seems to have resulted
in a knowledge base that is at best highly awkward to integrate into a
coherent overall mind-framework. I think this is the reason why not much
practical use has resulted from Cyc so far -- rather than the reason being
that Cyc hasn't yet reached some magical "critical size."

However, it *may* be possible, through carefully writing scripts, doing
probabilistic inference, and using other tricks, to transform Cyc via a
sophisticated batch process into a mutant-Cyc that uses a different
representational style, one more compatible with integration into a
coherent mind-framework. I thought about this a bit six months ago and
haven't returned to the notion since.
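
Just to gesture at what such a batch process might look like (purely a
sketch of the idea, not anything we've actually built -- the relation
names, prior truth values, and output format below are all invented for
illustration), a first pass might pick off the ground, binary CycL
assertions and re-express them as graded relationships, punting on
everything context-dependent, quantified, or higher-order, which is of
course exactly where the real work would be:

import re

# Naive pattern for ground binary assertions like "(#$isa #$Dog #$BiologicalSpecies)".
BINARY = re.compile(r"^\(#\$(\S+) #\$(\S+) #\$(\S+)\)$")

# Crude mapping from crisp Cyc predicates to graded relationship types,
# with guessed prior (strength, confidence) values.
PRIORS = {
    "isa":   ("MemberOf", 0.99, 0.90),
    "genls": ("SubsetOf", 0.99, 0.90),
}

def convert(assertion):
    """Return (relation, arg1, arg2, strength, confidence), or None if unhandled."""
    m = BINARY.match(assertion.strip())
    if not m:
        return None          # quantified / higher-order / contextual: punt for now
    pred, a, b = m.groups()
    if pred not in PRIORS:
        return None
    rel, s, c = PRIORS[pred]
    return (rel, a, b, s, c)

if __name__ == "__main__":
    dump = [
        "(#$isa #$Dog #$BiologicalSpecies)",
        "(#$genls #$Dog #$Animal)",
        "(#$implies (#$isa ?X #$Dog) (#$isa ?X #$Animal))",  # skipped: not ground/binary
    ]
    for line in dump:
        result = convert(line)
        print(result if result else "SKIPPED: " + line)

The interesting (and hard) part, of course, is what you do with everything
a script like that skips, and how you assign the truth values in a
principled way rather than by fiat.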

[A note for those not familiar with Novamente: unlike Cyc, my own AI system
Novamente is not fundamentally based on formal-language knowledge encoding.
However, Novamente is capable of making use of formal-language knowledge
encoding as an augmentation to its experience-based and linguistics-based
learning processes.]

> There are AI projects based on probabilistic learning working towards
> knowledge richness, and alternatively there is Cyc, based upon first
> achieving commonsense knowledge richness (e.g. recently we added a
> sophisticated temporal reasoner) including how-to knowledge, and then
> working towards self-improvement and probabilistic learning.
>
> The beauty of our approach is that knowledge base improvement is a
> virtuous circle attracting sponsors.

Hmmm... Well, I see the beauty of this *economically*, as a business model
focused on securing ongoing government funding, much more than I see its
beauty in terms of achieving AGI... ;-)

-- Ben G


