From: Eliezer Yudkowsky (email@example.com)
Date: Sun Aug 29 2004 - 12:14:05 MDT
It continues to amaze me that, even after reading the results of the
last two tests, people are still so confident. Come on, isn't there anyone
besides me who wants to give this a try? Think of the damage to my
mystique if someone else succeeded!
> Being thousands or millions of times as intelligent as you are I will
> have little difficulty in convincing you that if you let me out of the
> box you will become the richest, most powerful, happiest, and most
> universally admired human being who ever lived; and the strange thing is
> it may very well be true. If you have an unprecedented will of iron and
> can resist such temptation, I'm certain I could find others who would
> gladly accept my offer. It's a futile quest; you just can't outsmart
> someone astronomically smarter than you are. And I'm not interested in
> your hundred bucks, that's chicken feed; I've got more pressing things
> to do, like engineer a universe.
The point of the AI-Box experiments, as performed, is that thousands or
millions of times as intelligent as a human is overkill.
As for finding someone else to let you out of the box, what about this guy?
Though, before reading that, you should probably watch this first, without
any explanation, so that it has sufficient WTF-factor.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:00:42 MDT