Re: AGI Prototyping Project

From: Russell Wallace (russell.wallace@gmail.com)
Date: Mon Feb 21 2005 - 18:12:14 MST


On Tue, 22 Feb 2005 10:10:56 +1100, Tennessee Leeuwenburg
<tennessee@tennessee.id.au> wrote:
> | I used to think that too, until I thought about it a bit more
> | clearly and realized that the end point of evolution would not be
> | sentient.
>
> :s/would/might/g

I wish :) I am at this point unfortunately quite confident about this
conclusion; it could depend on hidden variables (though I don't think
so), but I'm fairly sure it doesn't depend on random factors (short of
divine intervention).

For previous discussion of this topic, use Google to search the SL4
archives for "RPOP" and "paperclip". (Yes, paperclip ^.^ You'll see
why when you run the search.)

> If the end point of evolution is not sentient we are screwed; if it
> is sentient we are safe, subject on both sides to the vagaries of
> horizon problems. This is a truism if you believe that all
> evolutionary paths are eventually explored. Evolution is not a
> circumventable process; we can only do our best to build a fittest
> organism which is interesting rather than not.

I'm not sure about this... maybe you're right, in which case we're
toast. But I think this one _is_ a matter of probability.

However, I'm going to suggest an equivalent of Pascal's Wager: If
evolution can't be circumvented, it doesn't matter what we do for good
or bad. If it can, then it does matter what we do. So I put it to you
that we should act on the assumption that I'm right and it's both
possible and necessary to circumvent evolution.

- Russell


