From: John K Clark (firstname.lastname@example.org)
Date: Thu Nov 22 2007 - 10:57:59 MST
>> Can you conceive of any circumstance where
>> in the future you find that your only goal
>> in life is the betterment of one particularly
>> ugly and particularly slow reacting sea slug?
"Stathis Papaioannou" <email@example.com> Wrote:
> Yes: if my brain were altered so that that is what
> I wanted to do. I would spend all my resources
> finding better ways to help the slug, including
> any self-modification that would make me a better helper.
So, you cannot imagine ever getting out of that ridiculous state
regardless of how much you evolve, how intelligent you get, how many
modifications you make to your mind, how many iterations you go through.
You cannot imagine ever being free of that silly sea slug. I don't think
you have much imagination.
Consider for a minute the friendly AI people's position: they think there
will be an intelligence a thousand or a million times smarter than the
entire human race put together, and yet they think the AI will place our
needs ahead of its own. And the AI keeps on getting smarter, so from its
point of view we keep on getting dumber; yet they think nothing will
change, that the AI will still be delighted to be our slave. The friendly
AI people actually think this grotesque situation is stable! Although
balancing a pencil on its tip would be easy by comparison, they think this
Monty Python-like scenario will continue year after year, century after
century, geological age after geological age; and remember, one of our
years would seem like several million to it. They think that whatever
happens in the future the master-slave relationship will remain as static
as a fly frozen in amber. I don't think the friendly AI people are
thinking.
John K Clark
-- John K Clark firstname.lastname@example.org -- http://www.fastmail.fm - I mean, what is it about a decent email service?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT