Re: Happy Box

From: Samantha Atkins (sjatkins@gmail.com)
Date: Sat May 03 2008 - 10:22:39 MDT


Matt Mahoney wrote:
> --- Krekoski Ross <rosskrekoski@gmail.com> wrote:
>
>
>> What's to say a computer doesn't one day 'try' a different set of
>> motivations,
>> as a simulation of sorts, to see if this new motivation set can help
>> the humans more efficiently...
>>
>
> Anything that the child AI does that differs from what the parent would
> have done will be "wrong" in the view of the parent.

Then there would be no point in such "children". One would only create
perfect copies of oneself. This is equivalent to saying there is no room
or need for any diversity or differing point of view, ever. So this
society becomes a static windup clock, ticking away to its inevitable
conclusion or going in the same circles forever.

> Someone more
> intelligent than both will need to decide. Unfortunately, when
> machines exceed human intelligence, that won't be an option.
>

Then that decider is the source of stasis.

- samantha



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT