Re: Universalising an AGI's duty of care

From: Philip Sutton (Philip.Sutton@green-innovations.asn.au)
Date: Mon Jul 18 2005 - 21:16:27 MDT


Hi Ben, Joel,

Ben:
> Again, the interpretation of the terms in these sentences requires a
> lot of human context -- I wouldn't assume that nonhuman AGI's or
> aliens would interpret them the same way as us... The definition of
> what is an "other" (i.e. what is a whole system -- a cell? a human? a
> society?) isn't even an absolute given...

Agreed that there is a lot of ambiguity. The best way to avoid it would be an
ethic of sticking strictly to one's own planet/home location, to minimise
contact with anything an AGI could have trouble interpreting. If AGIs are
not rule-bound to stay on their home ground (i.e. are ethically free to move
out into the universe), then they would have to deal with the
ambiguity/uncertainty that you have identified. All they can do is their very
best - however inadequate that is. The task for AGI designers is to create
a set of goals/ethics that would ensure that 'the best that can be done' is
really good.

Joel:
> I think that these golden rules are not something I want to have alien
> species use in judging how they treat me. For instance - a species may
> believe in life sacrifice and that it is the highest honour, however
> unless I believe in their spirituality I don't want an FAI deeming it
> okay because it fits these golden rules (presumably any of the aliens
> would volunteer and *want* to be the sacrificial individual).....

I think the less common golden rule 2 would cover what you are concerned
about: "don't do unto others what they would have you not do unto them".
This would require an AGI to find out your preferences.

Joel:
> When other sentient beings' actions *intentionally* put the primary
> interests (i.e. existence) of other individuals at risk, then they
> should be prevented from doing so.

This is basically a "thou shalt not kill (another sentient)" rule, as a sort of
minimum safeguard.

It raises the point that Ben made about what 'sentient' means in the
context of non-Earth life (and there's just a bit of ambiguity about what it
means on Earth too). And are there sentient life forms somewhere out
there in the (multi)universe for whom the notion of killing would be hard to
interpret? Could be.

Which brings me back to the point that trying hard to be friendly seems like
a desirable goal - even if mistakes could still be made. Not trying to be
friendly increases the probability, scale, and number of disasters.

Cheers, Philip
