Re: [agi] A theorem of change and persistence????

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Dec 19 2004 - 14:08:02 MST


Hmmm...

Philip, I like your line of thinking, but I'm pretty reluctant to extend
human logic into the wildly transhuman future...

The very idea of separating persistence from change is an instance of
human-culture thinking that may not apply to the reasoning of a transhuman
being.

Consider for instance that quantum logic handles disjunctions ("A or B")
quite differently from ordinary Boolean logic: in the lattice of subspaces
of a Hilbert space, "and" and "or" no longer obey the distributive law.
What kind of "logic" might a massively transhuman mind apply?
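
Concretely, here is a minimal sketch of the standard counterexample
(Python with numpy, using subspaces of R^2 to stand in for quantum
propositions; the helper functions are my own illustration, not anything
from the literature on this thread):

import numpy as np

def dim(vectors):
    # Dimension of the span of the given row vectors.
    return np.linalg.matrix_rank(np.atleast_2d(vectors))

def join(U, V):
    # U v V: the smallest subspace containing both (span of all the vectors).
    return np.vstack([U, V])

def meet_dim(U, V):
    # dim(U ^ V), via the identity dim U + dim V = dim(U ^ V) + dim(U v V).
    return dim(U) + dim(V) - dim(join(U, V))

A = np.array([[1.0, 0.0]])  # x-axis of R^2
B = np.array([[0.0, 1.0]])  # y-axis
C = np.array([[1.0, 1.0]])  # diagonal

# A ^ (B v C): B v C is all of R^2, so the meet is A itself (dimension 1).
print(meet_dim(A, join(B, C)))         # 1

# A ^ B and A ^ C are both the zero subspace, so their join is too
# (dimension 0). The distributive law fails: 1 != 0.
print(meet_dim(A, B), meet_dim(A, C))  # 0 0

In Boolean logic the two sides would have to agree; in the subspace lattice
they need not, which is one small hint of how alien a transhuman "logic"
could turn out to be.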

-- Ben

----- Original Message -----
From: "Philip Sutton" <Philip.Sutton@green-innovations.asn.au>
To: <agi@v2.listbox.com>; <sl4@sl4.org>
Sent: Sunday, December 19, 2004 12:01 PM
Subject: [agi] A theorem of change and persistence????

> I think I might have just worked out a basic theorem of relevance to
> artificial general intelligences. I'd be interested to know what you think.
>
> Let's postulate that an AGI is created that is committed to generating
> change in the universe (possibly fast or even accelerating change). Let's
> also postulate that this AGI wishes to persist through deep time (and/or
> that the AGI wishes some other entity or attribute to persist through deep
> time - note: this bracketed addendum is not necessary for the argument if
> the AGI wishes itself to persist).
>
> In the face of a changing world, if there is at least one thing that the
> AGI wishes to survive with (effectively) 100% certainty through deep time,
> then the AGI will need to *systematically* generate a stream of changes
> that 'locally' offset the general change in the universe, sufficient to
> enable the chosen thing to persist.
>
> Conclusion: an AGI that wants to persist through deep time (or that wants
> anything else to persist through deep time) will need to devote sufficient
> thinking time, action time, and resources to successfully managing its
> persistence agenda. In a reality of resource constraints, the AGI will
> need to become highly efficient at pursuing its persistence agenda (given
> the tendency for changes in the universe to radiate/multiply), and it will
> (most likely) need to manage its broader change-promotion agenda so as not
> to make its persistence agenda too hard to fulfill (a toy version of this
> tradeoff is sketched below, after the quoted message).
>
> What do you think?
>
> Cheers, Philip
>
> -------
> To unsubscribe, change your address, or temporarily deactivate your
> subscription, please go to
> http://v2.listbox.com/member/?listname=agi@v2.listbox.com
>
>
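
For what it's worth, Philip's resource argument can be made concrete with a
toy model. This is a sketch using my own assumed numbers rather than
anything from the post: the AGI tries to hold one scalar "state" at zero
while the environment perturbs it each step, and it persists only if its
per-step correction budget covers the drift it must offset.

import random

def persists(drift, budget, steps=10000, tolerance=1.0):
    """Toy persistence check: True if the protected state stays within
    tolerance for the whole run."""
    state = 0.0
    for _ in range(steps):
        state += random.uniform(0.0, drift)  # the universe's change
        state -= min(state, budget)          # the AGI's offsetting change
        if state > tolerance:
            return False                     # the protected thing is lost
    return True

print(persists(drift=0.5, budget=0.6))  # True: budget covers worst-case drift
print(persists(drift=0.5, budget=0.2))  # False: mean drift 0.25/step outruns 0.2

The point Philip is after falls out directly: the more change the AGI itself
promotes (larger drift), the larger the correction budget it must reserve
for whatever it wants to keep fixed.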


