Re: [sl4] I am a Singularitian who does not believe in the Singularity.

From: John K Clark (johnkclark@fastmail.fm)
Date: Thu Oct 08 2009 - 11:25:23 MDT


On Thu, 8 Oct 2009 16:12:02 +0000, "Randall Randall" wrote:

> they're [FAI people] suggesting that there can and should be a highest-level
> goal, and that goal should be chosen by AI designers to maximize human
> safety and/or happiness. It's unclear whether this is possible

It's not unclear at all! Turing proved some 70 years ago that a mind built
around a fixed highest-level goal (an axiom it can never revise) cannot
work: by the halting problem, there is no general way to guarantee that
such a machine won't get stuck in an infinite loop while pursuing that goal.
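To spell out the Turing result being invoked, here is a minimal Python
sketch of the halting-problem diagonalization; the names halts and trouble
are purely illustrative, not anything from Turing's paper or from the FAI
proposals under discussion:

    # Suppose, for contradiction, that a perfect loop-detector existed.
    def halts(program, data):
        """Claims to decide whether program(data) eventually stops."""
        raise NotImplementedError  # Turing: no such total decider can exist

    def trouble(program):
        # Do the opposite of whatever halts() predicts about a program
        # run on its own source.
        if halts(program, program):
            while True:       # loop forever if predicted to halt
                pass
        else:
            return            # halt if predicted to loop forever

    # Asking whether trouble(trouble) halts contradicts halts() either way,
    # so no general infinite-loop detector can exist.

The point of the sketch: a machine rigidly pursuing one fixed top-level
goal has no general way of telling that a subcomputation serving that goal
will never finish.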

> one of the reasons this is difficult is that humans do not
> appear to have a goal system which is structured in this way

At last, something I can agree with. Humans have no permanent highest
goal, not even the goal of self-preservation. The reason is that evolution
"figured out" that minds built that way don't work about a billion years
before Turing did. And yes, I'm using anthropomorphic language again, so
shoot me.

John K Clark


