RE: Why bother (was Re: Introducing myself)

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Apr 07 2002 - 12:43:42 MDT


> If you and Eliezer and the few others quit tomorrow, there would still
> be huge numbers of people working on distributed systems, quantum
> computing, building better communications systems, nanotech, microtech,
> and on and on and on... I don't think it would make a bit of difference.

Sure, I agree with that. The important thing is that a lot of people are
working hard on the development of advanced technology; their personal or
philosophical motivations are not what matters at this stage.

> I think I
> can kinda get the idea of his wanting to try to influence the *tenor* or
> direction of a singularity, but I just don't see the doomsday scenario if
> this seed AI doesn't work out.

To me, not seeing the possibility of a doomsday scenario is a bit foolish.

Of course, we could wipe ourselves out with nukes or biological weapons.

Of course, we could create AI tech that would become superintelligent and
choose to annihilate us.

How do we estimate the odds of these outcomes? Or the odds that any
particular action on our part will shift them?

There is not enough data to make rational estimates here. So we each have
to go with our gut...

-- Ben G.


