RE: Is generalisation a limit to intelligence?

From: Ben Goertzel (ben@webmind.com)
Date: Sat Dec 02 2000 - 12:45:01 MST


Hi,

> > Perhaps you're assuming that erroneousness itself adds some useful
> > creative "noise" to the thought process, whereas a system with a good
> > enough memory won't make enough errors to lead to creative thoughts.
> > It's possible. Again, we lack the science to quantify this effect.
>
> That's *exactly* what I mean. Thank you, sometimes I'm very bad with
> words. I was trying to point out the trade-off between errors plus
> creative intelligence and perfection plus rigid stupidity.
> Generalisation, as I see it, seems to be necessary but also limiting.

But here's the refutation: a sufficiently intelligent, self-aware system is
quite capable of modifying itself to make itself MORE ERROR-PRONE if it
finds through experimentation that this makes it more intelligent ;>

In practice, Webmind is plenty error-prone, using lots of approximate
reasoning and uncertain generalization. So the problem you describe seems
to apply to a far-future situation of hardware plenty...
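To give a flavor of what I mean by uncertain generalization, here is a
generic evidence-counting sketch in Python (a deliberately simplified
stand-in -- these are not Webmind's actual truth-value formulas):

# Generic sketch of uncertain generalization via evidence counting.
# (Simplified illustration; NOT Webmind's actual formulas.)

def truth_value(positive, negative, k=1.0):
    # Return (strength, confidence) for a generalization such as
    # "ravens are black", given confirming and disconfirming observations.
    # Confidence approaches 1 only as evidence accumulates, so conclusions
    # drawn from little data remain appropriately shaky.
    n = positive + negative
    strength = positive / n if n else 0.5
    confidence = n / (n + k)
    return strength, confidence

def deduce(tv_ab, tv_bc):
    # Chain two uncertain generalizations (A->B, B->C) into A->C.
    # The conclusion is weaker than either premise -- exactly the kind of
    # approximate, error-prone inference step under discussion.
    (s1, c1), (s2, c2) = tv_ab, tv_bc
    return s1 * s2, c1 * c2

ravens_black = truth_value(positive=98, negative=2)
black_hard_to_see = truth_value(positive=7, negative=3)  # "black things are hard to see at night"
print("ravens are black:", ravens_black)
print("ravens are hard to see at night:", deduce(ravens_black, black_hard_to_see))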

> > In a data-rich situation, more memory decreases the severity of
> > overfitting, in general.
>
> This, however, I don't understand. Do you mean that overfitting itself
> decreases as the set of data increases, or that overfitting to a large
> data set is OK? If you mean the former, it seems to me that an AI with
> infinite capacity and finite knowledge will always suffer from
> overfitting and thus be too rigid and not really intelligent at all
> (although it might fool most people, what with its infinite capacity
> and all). If you mean the latter, I don't think I understand how
> overfitting can be a good thing.

What I mean is that even if there is a LOT of data, and it's highly varied,
there is still a certain amount of overfitting that is inevitable.

On the other hand, the more memory you have, the more of this data you can
keep in mind for use in new model-building rounds based on new data combined
with the old. So the maximum-memory system will achieve the minimum amount
of overfitting possible given the data.
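As a toy illustration of that point (again my own sketch, nothing
Webmind-specific): rebuild the same high-capacity model from different
amounts of remembered data, and watch the gap between training error and
test error -- the overfitting -- shrink as more data is kept around.

# Toy sketch (illustration only): same model class, more remembered data,
# less overfitting.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(-1.0, 1.0, n)
    y = np.sin(3.0 * x) + rng.normal(0.0, 0.3, n)   # true signal + noise
    return x, y

def fit_and_score(n_remembered, degree=12, n_test=10000):
    x_tr, y_tr = make_data(n_remembered)            # data "kept in mind"
    x_te, y_te = make_data(n_test)                  # fresh data
    coeffs = np.polyfit(x_tr, y_tr, degree)         # high-capacity model
    train_mse = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    return train_mse, test_mse

for n in (20, 200, 2000):
    tr, te = fit_and_score(n)
    print(f"kept {n:4d} points: train MSE {tr:.3f}, test MSE {te:.3f}, "
          f"overfitting gap {te - tr:.3f}")

The noise floor here is about 0.09 (the variance of the added noise). With
2000 remembered points both errors sit close to it, whereas with only 20
points the model mostly memorizes noise -- some overfitting remains at every
data size, but the maximum-memory version gets the minimum of it.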

Ben


