Re: Goals, but not football (was re: Happy Box)

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Wed May 07 2008 - 07:10:45 MDT


> > Similarly, an AI with a goal like "ensure that every human being
> > survives, is happy and feels worthwhile, and beyond that obey human
> > instructions", would probably not stumble towards paralysis or rote,
> > even if that goal remains forever unchanged.

> If no, how do you propose to program the goal system so that these and
> a million other unintended consequences that you didn't anticipate
> don't happen?

I don't - that goal system was not a serious proposal, just an
illustration of a single goal system that need not stagnate.
