Re: Mindless Thought Experiments

From: Krekoski Ross (rosskrekoski@gmail.com)
Date: Sun Mar 02 2008 - 14:33:32 MST


> Nick writes
>
> > The point, I think, is that it seems like there should be an objective
> > fact of the matter as to whether some physical system is having
> > conscious experiences;
>
> Right.

I don't know if it's that simple.

>
>
> > but if computation is sufficient for experience, since what a physical
> > system is computing is subjective, this can't be the case.
>
> Oh, why not? Perhaps there is a terminological problem here, but
> to the physical system in question, of course the computation yields
> (subjective) experience. What a system is computing is objective,
> naturally, but the "consciousness" experienced, that old murky
> problem, "as seen from inside" by a device is all that is subjective.
>
> Lee

Why is whether something has experience or not even relevant? I don't
immediately know if any of you have conscious experience, but you'd pass a
Turing test if I gave you one. Although that wouldn't tell me whether you're
really conscious, it's 'good enough', so to speak.

That doesn't mean we can't revise the criteria of the test, but there's never
'proof' that another entity is conscious. It's not an argument that can be
resolved.

At any rate, the output of a system that may or may not be conscious may
itself be more or less opaque to us. That's perhaps a better avenue to think
about. The output of our own cognitive processes (this email) is
transparent (I hope) to most of you; no surprise, since we have similar cognitive
architectures. The output of an arbitrary computer program built by a human
is opaque to some and transparent to others. This is also to be expected. The
output of a system that is another step removed would be more opaque still, but
that is not to say it wouldn't be systematic or exhibit qualities of
intelligence if it were the byproduct of a reasoning system.

I personally think it will be glaringly obvious when we succeed in creating
an actual conscious entity, for the above reasons. The higher-order
organization and complexity we see in human behaviour, for example, is not
deducible from, or even obviously related to, the behaviour of a single neuron;
our low-level machine code, so to speak, is still not well understood. Yet
the higher-order level at which we interface with the world around us can be
much more isomorphic with our own experience. Obviously this is to be
expected, but at the same time, if we compare our own higher-level
organization with that of an insect, which has a fundamentally different
architecture, or even a bacterium, which differs still more, we see a
good deal of emergent similarity based on function, even though we still don't
know what kind of fundamental experience an insect or a bacterium has. This
is the kind of thing I think we need to watch for as evidence of
consciousness.

Rgds

Ross
