Re: moral symmetry

From: Simon McClenahan
Date: Tue Dec 31 2002 - 20:51:57 MST

On Tue, 2002-12-31 at 18:07, Wei Dai wrote:
> Eliezer suggests in CFAI that a Friendly AI may decide to divide the solar
> system into six billion equal parts and give each person property rights
> over a six-billionth share.

I haven't read CFAI in a long time, but to me this doesn't make sense.
No one owns anything. Ownership is a human, maybe even animal, concept:
a strategy for the strong to get stronger and hence for the weak to get
weaker, so long as everyone is playing the same ownership game. A
hostile intelligence would use the ultimate divide-and-conquer strategy
to claim ownership of everything. Why can't we just share the whole
Universe? And why should a "strong" being with a higher tolerance for
surviving on limited resources be given as much as or more than a
"weaker" being with a lower tolerance?

Semantically speaking, if an AI is going to be friendly to humans, it
should provide access to an abundance of resources without
over-consumption. Then maybe humans would learn to be friendly to each
other as well.

To quote from the recent (terrible) /Spider-Man/ movie, "With great
power comes great responsibility." A >H AI (or sysop? Are we still
using that term?) that has >H power will have responsibility, and
humans will be the beneficiaries. Ownership will only create conflict.
I know it sounds very spiritual and philosophical, but there really is
no law of nature that includes ownership as a requirement for
perpetuating life. So far it has caused mostly problems for humans. I
seriously doubt that an advanced society would value ownership so
highly.

If I get my fair share of space, I'll share it with anyone who asks for
it and is friendly enough to do the same.
