From: Ben Goertzel (firstname.lastname@example.org)
Date: Thu Apr 27 2006 - 12:27:28 MDT
> > I invite you to hypothesize any algorithm for processing n items
> > in memory, where any of the n items can be combined with any of
> > the other n-1 items, that takes computational resources linear in n.
> > THAT is the foolish notion.
> "not linear" != "exponential".
The subtle point here regarding human and near-term AGI cognition is
that the "n items in memory" shouldn't be considered arbitrary items
but are, rather, drawn from a distribution dependent on the organism's
environment and architecture and dynamics...
So, the worst-case complexity of the "processing" algorithm is not
what's important, but rather the average-case complexity over the
actual distribution of memory items encountered by the organism in
its environment...
And, the organism may well have the tendency to unconsciously fill its
memory with item-sets that are more easily processed by its own
particular processing algorithms...
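To make the worst-case vs. average-case distinction concrete, here is a minimal
Python sketch. It is purely illustrative and not from the original post: the
"combination" operation, the bucketing key, and the function names are all
hypothetical. The idea is that all-pairs processing of n items is inherently
Theta(n^2) in the worst case, but if the input distribution happens to spread
items across many small, independently-processable groups (analogous to an
organism filling its memory with item-sets its own algorithms handle cheaply),
the average-case cost can be far lower.

```python
from collections import defaultdict

def pairwise_brute_force(items):
    """Combine every item with every other item: always n*(n-1)/2 steps,
    no matter how the items are distributed."""
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            count += 1  # stand-in for a real combination operation
    return count

def pairwise_bucketed(items, key):
    """Only combine items that share a bucket key (a hypothetical stand-in
    for 'items that the organism's algorithms can relate cheaply').
    Worst case is still quadratic (everything lands in one bucket), but a
    favorable input distribution makes the average case much cheaper."""
    buckets = defaultdict(list)
    for item in items:
        buckets[key(item)].append(item)
    count = 0
    for group in buckets.values():
        count += len(group) * (len(group) - 1) // 2  # pairs within a bucket
    return count

n = 1000
items = list(range(n))
full = pairwise_brute_force(items)                         # 499500 pair-steps
sparse = pairwise_bucketed(items, key=lambda x: x % 100)   # 100 buckets of 10 -> 4500
```

With 1000 items, the brute-force pass does 499,500 pair-combinations; if the
distribution splits the same items into 100 buckets of 10, the bucketed pass
does only 4,500. Same algorithm concept, same worst case, very different
average case under a structured input distribution.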
The concept of "algorithm" is applicable here, but the concepts of
"linear" and "exponential" and other orders of complexity need to be
handled with much care...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT