From: Cliff Stabbert (email@example.com)
Date: Mon Jul 08 2002 - 19:05:59 MDT
Monday, July 8, 2002, 7:41:51 PM, Eliezer S. Yudkowsky wrote:
ESY> Cliff Stabbert wrote:
>> From GISAI (http://intelligence.org/GISAI.html#mind_thought_I ):
ESY> The updated version of this in LOGI (Levels of Organization in General
ESY> Intelligence) is:
ESY> Unlike GISAI, this contains at least one concrete example of how a mind
ESY> modeling itself in the first person is different from a mind modeling
ESY> itself in the third person.
Thanks for that link. I think the more extended discussion there of
when it's valid to use "I" lends further support to the position that
intelligence above a certain level of complexity is not possible
without awareness. Of course, we cannot prove awareness. You write
that
You write that
Legitimate use of "I" is explicitly not offered as a necessary and
sufficient condition for the "hard problem of conscious experience"
[Chalmers95] or social, legal, and moral personhood.
...which I'm not sure I even follow. My understanding of the "hard
problem" as Chalmers sketches it is not "which conditions are
necessary and sufficient to give rise to awareness," but rather "what
is consciousness/experience, and why does it arise from and accompany
certain processes?"
Two quotes from Chalmers' paper may be in order here:
[it's at http://www.u.arizona.edu/~chalmers/papers/facing.html, for
those playing along at home]
What makes the hard problem hard and almost unique is that it goes
beyond problems about the performance of functions. To see this,
note that even when we have explained the performance of all the
cognitive and behavioral functions in the vicinity of experience -
perceptual discrimination, categorization, internal access, verbal
report - there may still remain a further unanswered question: **Why
is the performance of these functions accompanied by experience?**
The emphasized question is what I take as the essence of his hard
problem. *That* there is something it is like to be a bat, he
does not seem to dispute:
It is undeniable that some organisms are subjects of experience.
In which case, it seems to me that the question of whether an AI,
another person, or some other creature "actually" experiences
consciousness is formally undecidable, and irrelevant: whether they
have awareness seems more of a Consensus question, i.e., "we can tell"
(leaving aside boundary cases) can serve as a working definition.
To repeat: I very strongly suspect there are fundamental reasons why
intelligence of a certain order must be accompanied by what we would
agree, from the outside, is awareness. I cannot yet formulate those
reasons as well as I'd like, but the section you link above would
serve as an excellent beginning.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT