From: Ben Goertzel (email@example.com)
Date: Wed Jan 01 2003 - 09:40:00 MST
Colin Hales wrote:
> Imagine that inside our heads, intimately attached to and driven by the
> neural/glial activity we observe, is an as yet unspecified
> emergent effect.
> It 'is' the subjective feel, qualia, phenomenal consciousness (everyone's
> got their name for it!). The cells are trained in form and behaviour to
> express it as needed. Our learning is guided by the experience of it. We
> have privileged access to it conferred by our 1st person role and also
> unspecified organisational details in the brain. Let's say it is based on
> phenomenon X.
> We know that any AGI without X has a huge zombifying hole in the feedback
> mechanism driving learning. No magic. No souls. Just good ol' brute physics
> at work, getting IQ out of a given amount of 'stuff'.
> X is really Occam's-razor-simple and obvious, there in front of us all the
> time. We're all going to go 'doh!', why didn't we see that before?
> Now, the Nobel prize winning phenomenon X is....<< take a stab, and no
> pinching Penrose/Hameroff stuff either :-) >>.
I don't agree with this, and I suspect most folks on this list do not...
I do agree that there is a lot we don't understand about consciousness & its
connection to physical reality.
However, I am not at all sure that we need to fully understand the nature of
consciousness, in order to create an AGI.
Similarly, we don't need to fully understand the nature of energy to create
an engine. And we don't need to fully understand the nature and origin of
life to create an artificial organism (a project Craig Venter and his team
are now working on).
In another decade, you and I may be chatting on this list with an AGI
system, mutually pondering the mysterious and beautiful nature of the
awareness we all share...
Having said that, I do have my own speculations regarding consciousness,
which I'll briefly share. I stress that these speculations are only loosely
related to my own practical AGI work.
Firstly: As for the "X-factor" underlying consciousness, my money is on
randomness, which I believe must be considered subjectively -- "randomness
with respect to a particular observer." [Roughly speaking, X is random with
respect to observer O if O cannot produce a better program for computing X
than "list the components of X". This has been extensively formalized in
algorithmic information theory.]
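[To make the idea concrete, here is a rough operational sketch, using a
compressor as a stand-in for the observer O: data is "random" to this
observer if the observer's best program for it isn't much shorter than just
listing its bytes. The choice of zlib and the 0.95 margin are illustrative
assumptions, not part of the formal theory, which uses true program length
(Kolmogorov complexity) rather than any particular compressor.]

```python
import os
import zlib

def looks_random(data: bytes, margin: float = 0.95) -> bool:
    """Compression-based proxy for observer-relative randomness.

    The 'observer' here is the zlib compressor: data counts as random
    with respect to it if the compressed form is not much shorter than
    simply listing the bytes (i.e., compressed size >= margin * raw size).
    """
    compressed_size = len(zlib.compress(data, 9))
    return compressed_size >= margin * len(data)

# A highly patterned string: the observer finds a short description.
patterned = b"abc" * 1000
print(looks_random(patterned))   # False: compresses to a tiny fraction

# High-entropy bytes: no description much shorter than the data itself.
noise = os.urandom(3000)
print(looks_random(noise))       # True: barely compresses at all
```

Of course the same bytes can be random to one observer and patterned to
another; a stronger compressor (a smarter observer) may find structure that
zlib misses, which is exactly the observer-relativity the bracketed
definition is after.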
The mind of a physical system is a fuzzy set whose elements are drawn from
the set of patterns in that system. The set of patterns in the system forms
an "emergent dynamical system" related to, but different from, the dynamical
system of the physical system itself.
The dynamics of the mind-system associated with a physical system may be to
some extent unpredictable (i.e. random) with respect to *the mind-system
itself*. This leads toward the phenomenon of consciousness.
The *intensity* of consciousness has to do with the rate of flux between the
random and ordered realms occurring in a mind-system: the emergence of
patterns in portions of the mind-system that were previously opaque to the
mind-system, and the descent into opacity of portions of the mind-system that
were previously patterned (from the mind-system's own perspective).
This is yet another one of those interesting lines of thought that I've put
on the back-burner for a while, focusing instead on revising the in-process
Novamente book, and on Novamente-based product engineering...
But if this line of thinking is at all on-target, then consciousness is a
systemic, emergent phenomenon, not tied to any particular physical substrate.
In fact, I suspect that a correct understanding of consciousness as a
systemic, emergent phenomenon *may* be helpful in our understanding of the
physical universe -- the quantum theory of measurement, Grand Unified Field
Theory, etc. etc. Rather than looking to physics to save cognitive
psychology and AGI, I'm more inclined to look to cog psych and AGI to save
physics.
-- Ben Goertzel
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT