Re: AGI Prototyping Project

From: Russell Wallace (russell.wallace@gmail.com)
Date: Mon Feb 21 2005 - 06:21:28 MST


On Mon, 21 Feb 2005 22:30:17 +1100, Tennessee Leeuwenburg
<tennessee@tennessee.id.au> wrote:
>
> Please let me know if I'm being too verbose, or talking rubbish.

You're talking rubbish, but you're not being too verbose :) That is,
while I disagree with the content of what you say, your style is fine;
please keep it up.

> Most people here have probably
> covered this before, but I still have to prove my grounds.

The topic has been discussed before, though of course the list
archives are extensive and finding messages on a particular topic may
not be trivial; still, Google could probably dig some up for you.

> There are four major ways to be frightened by AGI that come to mind now,
> only one of which I think is worth worrying about.
>
> 1) Skynet becomes self-aware and eats us
> 2) AGI kills us all in our own best interests. How better to eliminate
> world hunger?
> 3) AGI needs our food, and out-competes us. Bummer.
> 4) AGI destroys our free will

(I don't see any difference between 1 and 3, but you seem to, so I'll
address 3, which is the one on which we most strongly disagree.)

> I am only worried about (1). I can imagine (3) happening, but I don't
> object to it. Survival of the fittest is how I got here, and damned if
> I'm going to starve to death for the sake of some rats. I think it's
> fair enough to apply the same standard to something smarter and higher
> on the food chain.

I used to think that too, until I thought about it a bit more clearly
and realized that the end point of evolution would not be sentient.
Let survival of the fittest run to its ultimate conclusion and you'll
have something that might be considered intelligent - at least, it'll
be better than any human at solving some types of engineering
problems, for example - but there will be nothing it is like to
be that thing. There'll be a universe full of marvellously intricate,
optimally self-replicating nanomachinery - and nobody there to make
use of it.

Still happy with that?

> * We should realise that evolution can be made to work for us by
> building an AGI ecosystem, and thus forcing the AGI to survive only by
> working for the common interest

And the above is why doing things on this basis turns out to be such a
horribly bad idea.

- Russell


