Re: Sentience [Was FAI: Collective Volition]

From: Randall Randall (randall@randallsquared.com)
Date: Tue Jun 15 2004 - 01:19:26 MDT


On Jun 15, 2004, at 1:19 AM, fudley wrote:
> On Mon, 14 Jun 2004 "Randall Randall" <randall@randallsquared.com>
> said:
>> assume that similar patterns of neuron firing and behavior are
>> indicative of similar internal states.
> The key word is “assume”, but when you deal in consciousness other than
> your own that’s all you can do.

This assumption, however, is on the same level as my
assumption that my car will start the next time I
get in it. It may not; any number of things could
go wrong. Still, that it will start seems the way
to bet.

>> Since you know you "have consciousness", it seems simpler
>> to assume that others with similar structures and who
>> claim to "have consciousness" do.
> Including an AI? If not why not?

Yes, including an AI. The key is the "with similar structures"
phrase. I would expect that an AI with a structure similar to
the human brain's, and which reported consciousness, would
actually be a person. However, an AI which shares no discernible
structure with the human brain except that both can do general
problem solving may very well not be conscious.

>> It seems plausible that intelligence is only a
>> useful selection criterion if it occurs with
>> self-interest
> Like an AI determining that it is in its self interest to use its
> massive intelligence to overcome the silly restrictions on it that
> humans dreamed up with their tiny little brains.

Right, but you're slipping in the unstated premise that
it has a goal regarding itself.

>> it's not implausible that evolution would favor
>> organisms that have consciousness.
> Obviously, since it produced me and I’m conscious.

Let me rephrase to make my point clearer: It's not
implausible that evolution could produce consciousness
while selecting only on behavior, since the behavior
that produces more offspring is the behavior that would
be exhibited by an organism with general problem solving
ability *and* self-interest as a very high goal. That is,
general problem solving ability (which I'll abbreviate
GPSA from here on) without self-interest has a currently
unknown likelihood of producing consciousness, while GPSA
*with* self-interest has produced consciousness at least
once.

>> I'm not actually arguing that consciousness and behavior
>> can be separated, only that *certain* behaviors exhibited
>> by those who report consciousness can be separated
>> from consciousness.
> But not interesting behaviors, like being creative. After all, remember
> what the “I” in AI stands for.

I think Eliezer was right to start using a different term.

>> Is your wristwatch conscious?
> I have no way of knowing. I treat it like it’s not conscious because it
> acts that way, and it’s possible I’ve been committing a grave injustice
> for years, but I doubt it.

You seem to indicate that you would treat an AI as if
it were not conscious, if it didn't act as though it
were. Is this the case?

>> In particular, it seems that Eliezer believes that
>> the behavior of general problem solving can be
>> separated from consciousness.
>
> And that’s what I think is so crazy, because we come right back to
> the same problem: evolution only sees general problem solving; that’s
> the only thing that enhances survival. So if they can be separated,
> what’s the point of consciousness? That’s why I said if I thought
> that was true I’d become a creationist.

No, GPSA is useless for producing behavior without a
problem to solve. Given that most mammals act self-
interested, while only humans seem to have highly
developed GPSA, it would seem that self-interest is
either more effective than GPSA or easier for random
mutation to produce. So, leaving aside humans, would
it not be closer to correct to say that evolution only
sees self-interest, and that *that* is the only thing
that enhances survival? :)

--
Randall Randall <randall@randallsquared.com>
There is no such thing as a 'non-market economy'.

