Re: Friendliness SOLVED!

From: Peter de Blanc
Date: Sun Mar 16 2008 - 09:14:58 MDT

On Wed, 2008-03-12 at 23:56 -0400, Thomas McCabe wrote:
> This is how humans usually act; it is not how most AIs will act. Item
> #1,782 on my agenda is to prove that, except for special cases, you
> get a higher expected utility when another agent shares your utility
> function than when the two agents have different utility functions.
> Hence, forced modification of the other agent's utility function also
> has positive utility.

The "special cases" are not as special as you may think. I have one for
you. There's a bomb in some populated area. You believe that cutting the
red wire will detonate it, and cutting the blue wire will disarm it.

So, to summarize your beliefs:

Red -> detonate
Blue -> disarm

Alf and Beth both believe:

Red -> disarm
Blue -> detonate

Alf wants to detonate the bomb, and Beth wants to disarm it. Which one
would you rather have standing next to the bomb with wire-cutters?
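The scenario above can be worked through numerically. The following is a minimal sketch, with illustrative numeric utilities (+1 for disarm, -1 for detonate, from your perspective) that are my own assumptions, not part of the original post: each agent cuts the wire they *believe* serves their utility function, and you score the result under *your* beliefs.

```python
# Sketch of the bomb thought experiment. Agent names are from the post;
# the numeric utilities are illustrative assumptions.

# Beliefs: which outcome each agent thinks a given wire causes.
your_beliefs = {"red": "detonate", "blue": "disarm"}
alf_beliefs = {"red": "disarm", "blue": "detonate"}
beth_beliefs = {"red": "disarm", "blue": "detonate"}

# Utilities over outcomes. Beth shares your utility function; Alf does not.
your_utility = {"disarm": 1, "detonate": -1}
alf_utility = {"disarm": -1, "detonate": 1}
beth_utility = {"disarm": 1, "detonate": -1}

def chosen_wire(beliefs, utility):
    """The wire an agent cuts: whichever they believe maximizes their utility."""
    return max(beliefs, key=lambda wire: utility[beliefs[wire]])

def value_to_you(agent_beliefs, agent_utility):
    """The outcome you expect (under YOUR beliefs) if this agent acts."""
    wire = chosen_wire(agent_beliefs, agent_utility)
    return your_utility[your_beliefs[wire]]

# Alf (different utility, wrong beliefs) cuts blue, which you believe disarms: +1.
# Beth (same utility, wrong beliefs) cuts red, which you believe detonates: -1.
print(value_to_you(alf_beliefs, alf_utility))    # 1
print(value_to_you(beth_beliefs, beth_utility))  # -1
```

Under this scoring, the agent with the *opposite* utility function is the one you prefer to have at the bomb, since their mistaken beliefs cancel their mistaken goals, which is the point of the counterexample.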

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT