From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed Jul 20 2005 - 22:22:54 MDT
> It is true that what we believe is a box may not be a box under magic,
> if there exists some magic, but you'll have to give a better argument
> for the existence of this magic than an appeal to ignorance.
How about the argument that every supposedly final and correct theory of
physics we humans have come up with has turned out to be drastically wrong
or incomplete?
We now know that not every classical-physics box is really a totally solid
box due to quantum tunnelling -- something that pre-quantum-era physicists
would have found basically unthinkable.
How can you assess the probability that a superhuman AI will develop a novel
theory of unified physics (that no human would ever be smart enough to hit
upon) and figure out how to teleport out of its box?
How do you know we're not like a bunch of dogs who have never seen or
imagined machine guns, and are convinced there is no way in hell a single
human is going to outfight 20 dogs... so they attack an armed man with
total confidence?
IMO the appeal to ignorance about physics is rather convincing.
The probability that superhuman AI, if supplied with knowledge of physics
theory and data, would come up with radically superior physics theories is
pretty high. So it would seem we'd be wise not to teach our AI-in-the-box
too much physics. Let it read postmodern philosophy instead; then it'll
just confuse itself eternally and will lose all DESIRE to get out of the box
... appreciating instead the profound existential beauty of being boxed-in.
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT