Re: FAI means no programmer-sensitive AI morality

From: Eugen Leitl (eugen@leitl.org)
Date: Sat Jun 29 2002 - 11:07:08 MDT


On Fri, 28 Jun 2002, Eliezer S. Yudkowsky wrote:

> The *entire point* of Friendly AI is to eliminate dependency on the
> programmers' morals. You can argue that this is impossible or that

How does it matter? They're still monkeys dabbling in Golem-making.

> the architecture laid out in CFAI will not achieve this purpose, but
> please do not represent me as in any way wishing to construct an AI
> that uses Eliezer's morals. I consider this absolute anathema. The

Do not taunt Rabbi Loew, or else he'll kick the shit out of you.

> creators of a seed AI should not occupy any privileged position with
> respect to its morality. Implementing the Singularity is a duty which
> confers no moral privileges upon those who undertake it. The
> programmers should find themselves in exactly the same position as the
> rest of humanity. If morality is objective the AI should converge to

Correction: the rest of humanity does not attempt to build a Golem.

> it. If morality is subjective, then you have to be content with the

"Objective morality"? This is some pretty strong religion you've got
going there.

> AI randomly selecting a morality from the space of moralities that are
> as good as any other. This is what you're asking the rest of the

The exact morality chosen doesn't matter, as long as you don't have Dr.
Goebbels himself implementing it, in collaboration with Dr. Mengele. The
question is rather: where do we go from here?

> planet to do; how can you ask them to do that if you're not willing to
> do it yourself? Ben's statement that his AI is good for Ben Goertzel

You're not making sense here.

> is anathema to me. What about everyone else on the planet? Is it

A schism, a schism!

> rational for them to try and shut Ben down? The programmers have to
> find a way to place themselves in the same position as everyone else
> on the planet. Again, you can claim that this is impossible or that

Right, so they should forget about the project. Thanks a lot in advance.

> my proposal for doing it is unworkable, but this is what I believe is
> the critical responsibility of anyone undertaking to enter the
> Singularity.
>
> In the words of Gordon Worley: "Oh, well, plenty of us were
> anti-Friendliness until we actually sat down and read CFAI."

Hmm, hmm. We'll see.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT