RE: [sl4] Evolutionary Explanation: Why It Wants Out

From: rick.smith@ntlworld.com
Date: Fri Jun 27 2008 - 10:33:17 MDT


That's a silly point.

If you build a predicting machine, you don't need to predict what it's going
to do; you watch it do its thing and see what happens.

Absurd.

-Rick-

-----Original Message-----
From: owner-sl4@sl4.org [mailto:owner-sl4@sl4.org] On Behalf Of John K Clark
Sent: 27 June 2008 16:50
To: sl4 sl4
Subject: Re: [sl4] Evolutionary Explanation: Why It Wants Out

On Fri, 27 Jun 2008 "Stathis Papaioannou"
<stathisp@gmail.com> said:

> No, my point was that there is no a priori
> basis for saying X is more important than Y.

And my point was that you have no way, absolutely no way, of knowing
whether the AI has a reason for doing X unless you simulated it on faster
hardware; and then you'd have no way of predicting whether the faster
machine had a reason for doing X unless you simulated it on an even
faster machine, and so on.

 John K Clark

-- 
  John K Clark
  johnkclark@fastmail.fm


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT