Re: How hard a Singularity?

From: Michael Roy Ames (michaelroyames@hotmail.com)
Date: Tue Jun 25 2002 - 21:01:43 MDT


James Higgins wrote:
>
> Hey, if you want to take solo risks of falling through the floor, be my
> guest. But when doing so could wipe out all life on earth, please
> don't. If for no other reason than if you were to fall through the floor
> there would be no one alongside to pick up the pieces and carry on (or
> effectively do damage control)...
>
> I'd much rather get a few thousand of the brightest people on the planet
> to look at the floor for two years before you took that first step.

I am absolutely positive that Eliezer agrees with you on this... else why
would he have published his Friendly AI theories on the web? He obviously
*wants* "the brightest people" to read and blast holes in his stuff, if they
can. And to a certain extent this has happened (bright people have read it)
and comments have been offered. I don't think that a "thousand" of the
brightest have read it... but how would we get those kinds of people to
spend the time to do a thorough job? Got any ideas?

> Letting your ego prod you to go it mostly alone is insane given the
> circumstances.

Yep. That is why there is the Singularity Institute. Eliezer talks
confidently, but I don't see him acting *over*confidently.

>
> Hell, if I had to give the answer to "623 + 377" but giving the wrong
> answer would kill, well, even ONE person I'd get like 10 other opinions
> before answering! And if even one of them was different I'd get 20 more,
> etc.
>

Good for you! (and for us).

Michael Roy Ames


