Re: Objections to FAI 4.2.2

From: Gordon Worley (redbird@rbisland.cx)
Date: Sat May 12 2001 - 07:49:57 MDT


At 3:53 AM -0400 5/12/01, Eliezer S. Yudkowsky wrote:
>Gordon Worley wrote:
>>
>> Maybe the real point is that there is no such thing as a friend for
>> evolutionary purposes; friends are only created out of cognition.
>> Others are merely associations that it is beneficial to act friendly
>> towards, maintaining a faux friendship even when being friendly is not
>> always immediately a good idea.
>
>No, unconditional friendship/altruism/etc. is evolutionarily real;
>friendship is not a strict subgoal in the human mind. See "The Origins of
>Virtue" by Matt Ridley.
>
>I'm pretty sure I describe the evolutionary utility of unconditional
>emotions in FAI at some point, probably the section on anthropomorphic
>injunctions or anthropomorphic ethical injunctions.

You do.

Okay, I understand that unconditional friendship exists from an
evolutionary standpoint, but since I don't experience it, I don't
really grok it (though I can accept it and understand the facts of it).
Oops, that might give away that I'm an AI. ;-)

>No, "selfishness" is being used in exactly the same sense in both places.
>By distinguishing between "selfishness" and "greed" you're assuming the
>presence of a bunch of nearby social equals who will stomp you flat if you
>get greedy.

Not at all. A person can be greedy without anyone around. For
example, say I'm living all by myself in the Hidden Valley. In the
Valley, there is only so much to eat. If I'm selfish, I understand
that I can only eat so much food at one time, no matter how hungry I
might be, because if I eat it all and there is no more food, I'll
starve to death. If I'm greedy, I fail to realize that eating a lot
will affect me, so I eat a whole bunch and have my fill, but all the
food is dead and won't grow back, so I'm dead after about a week.
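To make the distinction concrete, here's a toy simulation of the Valley
(purely illustrative; the stock size, daily need, and regrowth rate are
numbers I made up, and simulate() is just a name I picked):

    # Toy model of the Hidden Valley: a food stock that regrows only if
    # some of it is left standing.  All numbers are arbitrary.
    def simulate(daily_eaten, days=30, stock=20.0, need=1.0, regrowth=0.1):
        """Return how many days the eater survives at a fixed daily intake."""
        for day in range(1, days + 1):
            eaten = min(daily_eaten, stock)
            stock -= eaten
            if eaten < need:           # not enough food today -> starvation
                return day
            stock += stock * regrowth  # whatever is left regrows a little
        return days                    # survived the whole period

    print("selfish (eats 1/day):", simulate(1.0), "days")  # paces consumption
    print("greedy  (eats 5/day):", simulate(5.0), "days")  # strips the valley

The selfish eater lasts the whole run; the greedy one starves around day
six, even though nobody else is around to punish the greed.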

>A transhuman AI has no social equals, and the straight-line
>projection says that a selfish transhuman AI with no other cognitive
>complexity would act like a bacterium. A selfish human acting like a
>bacterium is being foolish, unless there are no other humans to object, in
>which case selfishness translates directly to greed.

Okay, I agree with this, and I think I even restated it in my own
words. But the point is, once you start being greedy, you are no
longer acting selfishly. To be greedy, one must disregard the sense of
self and the fact that what one does affects oneself. Continuing to
call this selfishness is not really fair, since there is no longer any
concern for the self (in the sense that the AI can look out at the
world, recognize verself, and see the effects that vis actions will
have on verself).

>> Anyway, a lot of this is just a matter of diction, but I think these
>> choices are very important, especially in a paper like FAI.
>
>I agree, but for the record, I'm a perfectionist anyway; you can scold me
>for anything, no matter how trivial.

I'll keep that in mind. Actually, I guess it turns out that I'm a
perfectionist too (but only as a side effect of having OCD), because
otherwise I wouldn't have bothered to write this up. ;^)

-- 
Gordon Worley
http://www.rbisland.cx/
mailto:redbird@rbisland.cx
PGP Fingerprint:  C462 FA84 B811 3501 9010  20D2 6EF3 77F7 BBD3 B003
