RE: How hard a Singularity?

From: Eugen Leitl (eugen@leitl.org)
Date: Sun Jun 23 2002 - 06:03:20 MDT


On Sat, 22 Jun 2002, Ben Goertzel wrote:

> I think cryonics could possibly be cracked before AI if $100 billion were
> put into each.

The real figure is probably less than $10 M if spent efficiently. It's
not about developing a new procedure, but about figuring out what is
happening with the best of the current procedures (which are already close
to optimal).
 
> But cryonics research is harder to do on a shoestring budget, so many fewer
> people are working on it -- even fewer than on general intelligence, sadly!

I've burned two years of my life working on it. You'd be surprised what
can be done on a modest budget with proper motivation. The issue with the
cryonics community is that it has zero interest in validation, being quite
comfortable with cargo cult science.

> On the other hand, with virtually unlimited funding, I think AI would come
> about before robust anti-aging/anti-disease

Life extension research sees an order of magnitude more funding than AI
development. That disparity, if anything, is likely to grow wider, not
narrower.
 
> Agreed, of course... but in Eugen's future, there could be laws preventing
> uploads from transcending.
>
> Yucky to think about!

To use an analogy: space travel is expensive, and astronauts are
hand-picked and receive considerable training. Given that uploading is
going to be at least that expensive initially, don't you think we should
exercise a little diligence in picking the first crew? Especially since
that crew will have a lot to do with making space travel affordable for
Uncle Joe and Aunt Mary.
