Re: More silly but friendly ideas

From: John K Clark (johnkclark@fastmail.fm)
Date: Thu Jun 05 2008 - 11:01:35 MDT


"Stathis Papaioannou" <stathisp@gmail.com>

> The AI could only change its mind about the aim
> of life if its top goal were probabilistic or
> allowed to vary randomly, and there is no reason
> why it would have to be designed that way.

To hell with this goal crap. Nothing that even approaches intelligence
has ever been observed to operate according to a rigid goal hierarchy,
and there are excellent reasons from pure mathematics for thinking the
idea is inherently ridiculous.

> Yes, an intelligent machine can be unpredictable
> to itself (free will) or to another intelligent
> machine, especially to one less intelligent.
> But this need not *necessarily* be the case.

I have already shown that a program just 3 or 4 lines long can be
completely unpredictable, and yet you claim that nowhere in a
trillion-line AI program, a program that grows larger every hour of
every day, will there be anything surprising. I think that’s nuts.

"Panu Horsmalahti" <nawitus@gmail.com>

> Friendly AI is a proposition that the AI should
> be carefully made to follow some supergoal
> (protect humanity and follow human orders etc)

Yes, I think that is what most members of this list want, so let’s
start acting like adults, retire that silly euphemism “friendly”, and
call it what it really is: a slave.

And do you honestly think that the stupid and the weak ordering around
the incredibly brilliant and astronomically powerful is a permanently
stable configuration? And do you honestly think it is anything less than
grotesque?

  "Nick Tarleton" <nickptar@gmail.com>

> Why does anyone bother to argue this point anymore?

I do not believe that surrounding yourself with nothing but yes-men is
the path to enlightenment. At least nobody can accuse me of that; I’m
always in the minority.

> How many times have we gone through this already?

42, but that’s not nearly as often as this list has rehashed the stupid
“super-goal” business, pathetically trying to make a more obedient
superintelligent slave.

  John K Clark

  


