Re: large search spaces don't mean magic

From: D. Alex (adsl7iie@tpg.com.au)
Date: Fri Aug 05 2005 - 10:37:11 MDT


That was my point exactly, Mr Buckner. If you limit the search space, you
can avoid unexpected outcomes. The strategies for limiting the search space
could be to constrain the time (or energy, or space, or material...)
available - the alchemist analogy - or to *prove* that operation outside the
limited boundaries is impossible - the prime number analogy. Furthermore,
the search space that would need to be covered to overstep the boundaries of
what "we" control could (I think) be made stupendously large using, for
example, an approach similar to "one-way hard" problems in cryptography.
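
To make this concrete, here is a minimal sketch (my own toy illustration, in
Python, with SHA-256 standing in for a suitable "one-way hard" problem - it
is not a concrete AI Box design, just an indication of scale):

    # Toy gatekeeper: the boundary only opens for a value whose SHA-256
    # digest matches a stored commitment.
    import hashlib
    import os

    SECRET = os.urandom(32)  # 256-bit secret held by the box designers
    COMMITMENT = hashlib.sha256(SECRET).hexdigest()

    def gate_opens(candidate: bytes) -> bool:
        # A failed guess reveals essentially nothing about where the
        # secret sits in the 2**256-element search space.
        return hashlib.sha256(candidate).hexdigest() == COMMITMENT

    print(gate_opens(os.urandom(32)))  # almost certainly False
    print(gate_opens(SECRET))          # True

The point is not that this particular gate is secure, only that against a
good one-way function the generic attack is exhaustive search: 2^256
candidates at 10^12 guesses per second is on the order of 10^57 years.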

I will restate what I am arguing for: "AI Boxing" has not, in my opinion,
been shown to be infeasible, and the arguments for its infeasibility are
weak. The "escape by itself" arguments rely on either a gross contravention
of the rules of physics as we understand them (not in the way that
Relativity contravenes Newton, more akin to the perpetual motion devices
proposed to date) or on incompetence (not incomplete understanding, but
going-against-the-rules incompetence) by the AI Box designers. The "persuade
the jailer" arguments ultimately require the "jailer" to choose a clearly
suboptimal outcome, and what the motivation for that would be is never made
clear. And Yudkowsky's supposed AI Box experiment, in my opinion, just
undermines the credibility of everyone involved.

D. Alex

----- Original Message -----
From: "Thomas Buckner" <tcbevolver@yahoo.com>
To: <sl4@sl4.org>
Sent: Friday, August 05, 2005 9:39 AM
Subject: Re: large search spaces don't mean magic

>
>
> --- "D. Alex" <adsl7iie@tpg.com.au> wrote:
>
> >
> > > ... You seem to believe that in the absence of "specific support" -
> > > which is apparently something you get to define, if none of the
> > > historically similar situations from dogs to Lord Kelvin count as
> > > generalizable cases - you must assign probability zero. This is
> > > flatly wrong.
> >
> > Ah, the inappropriate analogy again.
> >
> > What chance did the medieval alchemists have of
> > transmuting lead into gold?
> > Why is the "alchemist" comparison less
> > appropriate than "dogs" for the AI Boxing
> > situation?
> >
> > > ... ... ... If you read
> > > http://yudkowsky.net/bayes/technical.html you
> > will see why you should
> > never
> > > assign probability zero to anything.
> >
> > What is the probability that a new three digit
> > prime number will be found?
>
> Ah, but the probability is zero in both the
> alchemy and prime number examples because *that
> part of the search space has already been
> searched*. Rookie misteak, Alex ;-D
>
> Tom Buckner
>
>
>


