From: John K Clark (email@example.com)
Date: Sun Nov 25 2007 - 11:35:05 MST
>> A program that looks for the first even number
>> greater than 4 that is not the sum of two primes
>> greater than 2, and when it finds that number
>> it then stops. When will this program stop, will
>> it ever stop? There is no way to tell, all you
>> can do is watch it and see what it does, and
>> randomness or chaos or the environment has
>> nothing to do with it.
David Picón Álvarez wrote:
> I can say when it will stop. It will stop when it
> runs out of memory. And that moment can be predicted.
Given X amount of memory you cannot predict whether the machine will stop
before it reaches that point; all you can do is watch it and see what it
does, or flip a coin and guess.
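For the curious, the program in question can be sketched in a few lines. This is a minimal Python version with an explicit search bound added purely so it terminates; the program described above drops the bound, and whether that unbounded search ever halts is exactly the open Goldbach question:

```python
def is_prime(n):
    """Trial-division primality test."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def is_goldbach_sum(n):
    """True if n is the sum of two primes greater than 2 (two odd primes)."""
    return any(is_prime(p) and is_prime(n - p) for p in range(3, n // 2 + 1))

def first_counterexample(limit):
    """Search even numbers up to limit for a Goldbach counterexample.

    Return the first even n > 4 that is NOT a sum of two odd primes,
    or None if no counterexample exists below the bound. The original
    program is this loop with the bound removed: n = 6, 8, 10, ...
    forever, stopping only if a counterexample turns up.
    """
    for n in range(6, limit + 1, 2):
        if not is_goldbach_sum(n):
            return n
    return None
```

Running `first_counterexample(2000)` returns `None`, as expected: Goldbach has been verified far beyond any bound a toy script would reach.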
"Harry Chesley" firstname.lastname@example.org Wrote:
> I believe I get it now: you mean that the AI
> is unpredictable from our perspective.
The AI is unpredictable even from its own perspective; just like us,
sometimes it won’t know what it’s going to do next until it does it. And
that is the only definition of the term “free will” that is not complete
gibberish.
"Nick Tarleton" <email@example.com>
> It is impossible to prove statements about the
> behavior of programs in general, sure, but we
> can still construct particular programs we
> can prove things about.
Big programs? Programs that do interesting things? Programs capable of
creating a Singularity? Jupiter Brain category programs? Don’t be
ridiculous.
"Stathis Papaioannou" <firstname.lastname@example.org>
> Perhaps you could explain how an AI which
> started off with the belief that the aim
> of life is to obey humans would revise this belief
Perhaps you could explain how it came to be that the beliefs of a
3-year-old Stathis Papaioannou are not identical to the beliefs of the
Stathis Papaioannou of today.
> but an AI with the belief that the aim of
> life is to take over the world would be
> immune to such revision.
I’m not saying it is. Perhaps Mr. Jupiter Brain will think human beings
are rather cute and throw us a bone every once in a while, or perhaps he
will get nostalgic thinking about the good old days, or perhaps he will
exterminate us like rats; my point was that Mr. Jupiter Brain’s decision
will be out of our hands. If a multi-billion-dollar corporation can’t
make Vista secure we’re not going to make a Jupiter Brain secure. (damn,
I shouldn’t have said that, now this thread is going to morph into an
orgy of Microsoft bashing)
John K Clark
-- John K Clark email@example.com -- http://www.fastmail.fm - IMAP accessible web-mail
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:01 MDT