From: Lee Corbin (firstname.lastname@example.org)
Date: Wed Apr 02 2008 - 00:12:33 MDT
> Lee wrote:
>> Matt writes
>> > Suppose you have a 3 way Turing test between a
>> > human, a GLUT, and a human whose memory is
>> > set to read-only at the start of the test
I guess I didn't understand that part. I thought that
you meant that for the duration of a test, e.g. between
times t1 and t2, no *new* memories may be acquired.
That sure *sounds* like what you meant! No?
>> > (or equivalently, whose mental state is reset after
>> > each question). Is the read-only human conscious?
So I responded
>> Of course the read-only human is conscious.
>> Suppose it's only Homer, and he's only reciting
>> the Iliad around a campfire. We adjust his nervous
>> system so that *during the narration*, on which
>> he is totally focused, neither the campfire, nor
>> any of the sounds, nor any other of his senses
>> are leaving any record whatsoever in his memory.
>> (Naturally we have to idealize this a bit in order
>> to address, in principle, the answer to your
>> idealized "read-only human" question.)
> That is not the same. He will still remember remembering.
Your brief reply fails to specify where my example goes
wrong. *During the test*, while Homer is reciting, he is
of course recalling his past life (he has to, among other
reasons, in order to recite the Iliad). But he is forming
no *new* memories, nor will he later be able to
remember remembering, where the latter "remembering"
refers to a process occurring between t1 and t2. That
remembering did indeed happen, but later he won't
remember that it did.
In fact, during the narration he doesn't happen to recall that
a moment earlier he said such and such---by my
stipulation, he is nonetheless able to recite the whole thing,
in the same way that you, while reciting the alphabet, do not
remember (unless you try) having recited the letter 'C' by the
time you have got to 'Q'.
>> It may be easy for some to say that there is no such thing as consciousness,
>> but someday we'll have much more thorough characterizations of it than
>> we do now, and the argument that they're wrong will be all the more solid.
> If you define consciousness as a sequence S of algorithmically similar states
> (K(S_{n+1} | S_n) = O(1)) then certainly it exists.
First, I don't define consciousness as a sequence of states,
similar or not. You could aim your question at Stathis, for
example. Moreover, I don't follow your mathematical
notation at all. Is the O supposed to be Landau's "big-oh"
notation? (I gather that K is Kolmogorov complexity, though.)
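If K is indeed conditional Kolmogorov complexity and O is Landau's
big-oh, then my best guess at a reading (and it is only a guess on my
part, not Matt's stated definition) would be

    K(S_{n+1} | S_n) <= c   for all n and some constant c,

i.e., each successive state S_{n+1} can be computed from its
predecessor S_n by a program of bounded length, so that the sequence
never introduces more than a constant amount of algorithmic novelty
per step. Is that what you mean by "algorithmically similar"?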
>> In the meantime, you can choose to believe that the Alien itself and its
>> many copies were no more conscious while they were answering the
>> 26^10000 possible questions, than is the relatively simple little lookup
>> device they left behind. If so, then you're quite wrong.
> They are not functionally equivalent. The Alien has episodic memory and the
> GLUT does not.
Yes, I agree. Even though the Alien easily passes the Turing Test,
and we have to affirm his intelligence, we are also tricked into believing
that the Giant LookUp Table has passed, since we are not given the
fact that it's only a recording of innumerable efforts on the part of the
Alien and its copies.
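For concreteness, the functional difference Matt points to can be
sketched in a few lines (a toy of my own devising, with made-up class
names, not anyone's actual proposal): a GLUT's answer depends only on
the question asked, while an agent with episodic memory carries state
from one exchange to the next.

```python
class GLUT:
    """Stateless lookup: the answer depends only on the question asked."""
    def __init__(self, table):
        self.table = table

    def answer(self, question):
        return self.table[question]


class EpisodicAgent:
    """Stateful: records every exchange, so the same question can
    draw a different answer later."""
    def __init__(self):
        self.episodes = []

    def answer(self, question):
        reply = f"You have asked me {len(self.episodes)} question(s) before this one."
        self.episodes.append((question, reply))
        return reply


glut = GLUT({"How many questions so far?": "I cannot say."})
agent = EpisodicAgent()

# The GLUT repeats itself no matter what has gone before:
g1 = glut.answer("How many questions so far?")
g2 = glut.answer("How many questions so far?")

# The agent's answer changes, because it remembers the exchange:
a1 = agent.answer("How many questions so far?")
a2 = agent.answer("How many questions so far?")
```

And of course a full table covering 26^10000 possible questions is
astronomically larger than anything physically storable, which is part
of why it can only be a recording of work done elsewhere.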