Re: Objective Meaning Must Exhibit Isomorphism

From: Stathis Papaioannou
Date: Fri Mar 07 2008 - 03:12:14 MST

On 07/03/2008, Krekoski Ross <> wrote:

> > > The decimal expansion of pi looks random, but it contains any string
> > > of digits you care to specify: islands of structure hidden in the
> > > noise. If these islands of structure contain observers, they will
> > > be no less conscious for the fact that an observer outside the
> > > ensemble can't find them.
> >
> That's an interesting thesis, but isn't it somewhat akin to saying that
> my DNA is aware, since it contains the code necessary to generate me,
> given a program to execute it? I don't think that information is akin to
> consciousness, or even computation for that matter. The two are
> fundamentally different processes.

Actually I didn't mean to suggest that pi is conscious; I just used it
as an example of structure hidden in noise. *If* the structures were
conscious (we can say they are computations rather than bitstrings),
they would be no less conscious for the fact that they are hidden from
an observer outside the ensemble.
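The claim that any digit string occurs in pi assumes pi is normal, which is widely believed but unproven. Purely as an illustration of "structure hidden in apparent noise", here is a short Python sketch that generates pi's digits with Gibbons' unbounded spigot algorithm and searches them for a pattern; the search target (the well-known "Feynman point" of six consecutive 9s) is just an example:

```python
def pi_digits(n):
    """First n decimal digits of pi, via Gibbons' unbounded spigot algorithm."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    digits = []
    while len(digits) < n:
        if 4 * q + r - t < m * t:
            # The next digit is now determined; emit it and rescale.
            digits.append(m)
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            # Otherwise, consume another term of the underlying series.
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return ''.join(map(str, digits))

s = pi_digits(1000)
print(s[:10])            # first digits: 3141592653
print(s.find('999999'))  # six 9s starting at decimal position 762
```

An outside observer scanning the stream sees only noise until a search picks out such an island; the island is there whether or not anyone looks.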

> For example, if I were to swallow tetrodotoxin or some other neurotoxin, I
> would quickly die, but the fundamental structure of my brain would remain
> unchanged for some time. I don't think that 'I' would remain conscious until
> my brain started to decay.

No, because consciousness requires a series of states, not a single
state frozen in time. But I would argue that there need be no causal
connection between the states: as long as they have the right structure,
you will experience continuity of consciousness. It's just that
usually the states aren't produced with the right structure unless
they are causally related.

> Similarly, if I were to describe the position of every atom in my body
> with a specific number, that number would not exhibit consciousness in
> and of itself. But even if it did, are all the possible polymorphisms of
> that number conscious in precisely the same way? A reversal of the
> number, subjecting the number to any one of several
> information-preserving compression algorithms, or reducing that number
> to its lowest possible KC (Kolmogorov complexity) value by some means.

Well, multiple realisability is a basic assumption of functionalism.
There is a mapping from computer A to computer B, so computer B is
performing the "same" computation as A. I see no basis for deciding
that some mappings, like the simple ones, are allowed while others are
not.
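To make the quoted question concrete: a minimal Python sketch (the state string is of course just a stand-in for the enormous number describing an atomic configuration) showing that reversal and lossless compression are invertible mappings, so in the information-theoretic sense nothing is lost by either transformation:

```python
import zlib

# A stand-in for the number describing an atomic configuration.
state = "31415926535897932384626433832795"

reversed_state = state[::-1]                 # reversal
compressed = zlib.compress(state.encode())   # lossless compression

# Both transformations have exact inverses, so the original
# description is fully recoverable from either form.
assert reversed_state[::-1] == state
assert zlib.decompress(compressed).decode() == state
print("both mappings are information-preserving")
```

Whether a conscious interpretation survives such a mapping is exactly the open question; the sketch only shows that the *information* does.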

> Let's say, though, that somehow an abstraction of my atomic
> structure/energy is sufficient for consciousness, that such structure is
> conscious in and of itself. What if we were to create n clones of me,
> utilizing this abstraction as input into a program? Let's further say
> that such clones are implemented in a computer program, and all are
> subject to the same internal environment. Let's also say that each clone
> is housed in a small computer, maybe 10cm x 10cm x 10cm in size; it's
> not important. If we create several instantiations of this program,
> several of these computers so to speak, and housed them in the same
> building, would there be only one instantiation of consciousness, given
> that it's the same structure, even though each instantiation of this
> structure is not causally interactive with any of the others, or are
> there multiple consciousnesses?

There is effectively only one consciousness. The reason I say this is
that as long as there is at least one instantiation, it makes no
difference to it if other instantiations start or stop. Suppose there
are two copies of you, A1 and A2, running in perfect lockstep on
separate computers. You are *either* A1 *or* A2 at a particular
instant, but it is impossible for you to know which. If you are A1 and
A2 suddenly stops, you notice nothing unusual happening, since you
continue being implemented as A1. If you are A1 and A1 stops, you
notice nothing unusual happening, since you continue to be implemented
as A2. The latter may seem counterintuitive, but it is equivalent to
saying that if you went through a Star Trek type teleporter, you
wouldn't notice anything unusual happening (other than ending up
somewhere else).
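A toy simulation of the lockstep scenario may help; the update rule here is an arbitrary deterministic function, not a claim about brains. Two copies started from the same state pass through identical state sequences, and halting one has no effect whatsoever on the other's trajectory:

```python
import hashlib

def step(state: bytes) -> bytes:
    """One deterministic update tick (an arbitrary stand-in for a simulation step)."""
    return hashlib.sha256(state).digest()

a1 = a2 = b"shared initial state"
history_a1, history_a2 = [], []

for t in range(10):
    a1 = step(a1)
    history_a1.append(a1)
    if t < 5:               # copy A2 is switched off after 5 ticks
        a2 = step(a2)
        history_a2.append(a2)

# While both ran they were in identical states at every tick,
# and stopping A2 changed nothing about A1's history.
assert history_a1[:5] == history_a2
print("copies ran in lockstep until A2 halted")
```

Since the two histories are indistinguishable from the inside, there is no fact internal to either run that could tell "you" which computer you are on.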

Stathis Papaioannou

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT