Re: Happy Box

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Sat May 03 2008 - 12:20:18 MDT


> No (super)goal is more "intelligent" than another excepting perhaps for
> self-consistency.

I wouldn't underestimate the possibility of AI goals drifting because
of inconsistency. We'd want the AI to care somewhat about the
happiness, survival and freedom of humanity; I doubt we will be able
to phrase these goals in a fully consistent way.

Similarly, a CEV will probably not be consistent, and will evolve as
humans change, likely in inconsistent ways.

Stuart
