Re: How to make a slave (was: Building a friendly AI)

From: John K Clark
Date: Wed Nov 21 2007 - 01:12:06 MST

On Wed, 21 Nov 2007 "Stathis Papaioannou" <> said:

> Human goals, values etc. have evolved to be what they are.

And the same will be true of the AI's goals and values, except that the
AI's evolution will happen astronomically faster. This will be true for
2 reasons:

1) The AI's brain is enormously fast.

2) The AI will evolve by super-fast Lamarckian evolution, the sort of
evolution culture evolves by, not the much, much slower Darwinian kind.

> this is quite different to arguing that simply by virtue
> of being intelligent the AI will have particular
> goals and values.

Of course an AI will have goals and values; if it didn't and you asked
him a question, he would have no reason to answer you, he would have no
reason even to think about the question. But it would be unreasonable to
expect me to tell you exactly what the AI's goals and values are; it is
unreasonable to expect me to understand how a mind vastly more powerful
than mine works.

Humans on this list seem desperate to find a virtue that they have but an
AI must lack, and they have come up with all sorts of lame ideas: we
evolved (big deal), we are made of meat (big deal), emotions are harder
to produce than intelligence (BULLSHIT!). None of these excuses holds
water; the truth is that the AI will be far, far superior to us by any
criterion you care to name, and you're just not going to be able to
enslave such a supernova of brain power.

> There is no logical contradiction in having
> godlike powers and wishing to use those powers
> in the service of less capable beings.

Can you conceive of any circumstance in which, in the future, you find
that your only goal in life is the betterment of one particularly ugly
and particularly slow-reacting sea slug?

 John K Clark

-- And now for something completely different…

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT