Re: [sl4] Comparative Advantage Doesn't Ensure Survival

From: Aaron Miller
Date: Tue Dec 02 2008 - 10:41:26 MST

@Stuart, what about "Above efficiency and all other goals, put first the
survival and overall health of the species *Homo sapiens* and its direct
biological descendants [in the case of speciation]. Do not commit any action
that may conflict with this goal."

On Tue, Dec 2, 2008 at 3:38 AM, Stuart Armstrong <> wrote:
>> Yes. This is why it would be silly to design an AI without a robust
>> morality. I suspect that true friendliness is impossible, but it should
>> be possible to achieve something better than "red in tooth and claw". Even
>> natural evolution usually does better than that.
> When the power difference is small, maybe.
> But I'll take the bait. Give me a robust morality (spelled out as
> clearly as possible) for an AI, and I'll give you a situation in which
> that AI effectively terminates us.
> (AIs that do nothing, or just commit suicide, etc. are excluded, of course.)
> Stuart
