Re: More silly but friendly ideas

From: CyTG (cytg.net@gmail.com)
Date: Fri Jun 06 2008 - 15:25:08 MDT


I love this discussion. About time, too.

When I play out this scenario, it always comes down to: how does the AI
upgrade/improve/bugfix itself? Are we talking hot-swapping of code, or "you
must reboot Windows for the effects to take place"?
Because a prime directive of any living thing must be "Goddamn, I'm afraid
to die.. don't wanna.. sorry Dave, but I'm afraid I cannot do that."
In regard to the numerical issue at hand, I'd think it has several
aspects worth investigating:
- there are obvious Bayesian implications
- neural constructs too
- and perhaps a hint of fuzzy logic
Maybe, just maybe, for an intellect as we know it today to function
properly, 100+1 must be farther away than 1+1.
And speaking of it .. here we are, with mere mortal IQs .. trying to
extrapolate the thought constructs of something thought to be a billion
times smarter than us ... good luck with that! Was it on this list that I
read this one: "idiot (man) turns on the uber-computer-of-all-time and asks
THE question: Is there a god? Computer: There is now! (bugger)"?
Maybe this is our fate .. I mean, look at us: if we're not a digital
construct with our four unique bases, I don't know what we are; we've even
got mechanisms to repair these sequences should the nasty analogue desert of
the real tamper with them.. So maybe that's it, quaternary to binary .. just
a part of the natural order of things..

On Fri, Jun 6, 2008 at 8:19 PM, Nick Tarleton <nickptar@gmail.com> wrote:

> On Fri, Jun 6, 2008 at 12:27 PM, Byrne Hobart <bhobart@gmail.com> wrote:
> > Slightly offtopic, perhaps, but does this apply to logical operators?
> > It's important to state our attitude towards uncertainty very clearly,
> > so we avoid internal contradiction or nihilism: I would argue that
> > logical rules, mathematical premises, etc., should not be considered
> > open to question. If your AI is only 99.99999% certain that +1 is
> > equivalent to "count to the next natural number", this infects all
> > mathematical operations it performs with some uncertainty, and means
> > that the higher it counts, the less sure it is (since any natural
> > number can be stated as 0 + 1 + ... + 1, and no datum can have two
> > different levels of uncertainty, this could cause the AI to be less
> > sure that 100 + 1 = 101 than that 1 + 1 = 2).
>
> If a mind were 99.99999% certain that +1 means "the next natural
> number", this wouldn't mean it believed that rule failed 0.000001% of
> the time; it would mean it was 99.99999% confident that it held all
> the time. Like Peter said, this wouldn't multiply with successive
> operations.
>
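
To make Nick's distinction concrete, here is a minimal Python sketch (my
own illustration, not anything from the thread; the 99.99999% figure is
Byrne's hypothetical):

# Reading A: each application of +1 independently fails with
# probability 1e-7, so confidence in "0 + 1 + ... + 1 = n" decays
# as you count higher.
# Reading B: a single global hypothesis "the +1 rule always holds"
# held with credence 0.9999999, so every sum inherits one and the
# same credence, no matter how high you count.

p = 0.9999999  # Byrne's hypothetical confidence in the +1 rule

def confidence_per_operation(n):
    """Reading A: uncertainty compounds with each +1."""
    return p ** n

def confidence_in_rule(n):
    """Reading B: no compounding; the credence attaches to the rule."""
    return p

for n in (2, 101, 10**7):
    print(n, confidence_per_operation(n), confidence_in_rule(n))

Under reading A the AI really would be less sure that 100 + 1 = 101
(0.9999999**101 is about 0.99999) than that 1 + 1 = 2, and by ten million
additions it is down to roughly 0.37. Under reading B both sums sit at
0.9999999, which is exactly Nick's point: confidence in the rule itself
does not multiply across successive operations.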


