From: Philip Sutton (Philip.Sutton@green-innovations.asn.au)
Date: Tue Feb 24 2004 - 08:40:15 MST
> 1) Any worldwide catastrophe has the power to prevent the Singularity,
> delay the Singularity, or increase the odds of an unfriendly
> Singularity.
I think this is an important conclusion.
> 2) The tasks involved in preventing or surviving such catastrophes (if
> they don't arise from Singularity-enabling technologies), or minimizing
> their impact on the pre-Singularity world, do not directly engage with
> the problems and dangers peculiar to the Singularity.
I think this conclusion could well be wrong. I'm particularly interested in
the development of AGI because I think AGIs, if reliably friendly, could
make a really major contribution to helping us solve a host of current
critical issues. (Although I'm less familiar with some of the other 'SL4
technologies', I suspect the same argument applies to them.)
> 3) If you choose to devote enough time and energy to really make a
> difference to major SL<4 issues, you will not make a difference to SL4
> issues, and vice versa.
I think this is a very ambiguous statement - I can't tell whether you
mean society as a whole or individuals devoting their time to SL<4
issues. For society as a whole the conclusion doesn't hold -
society has to deal with problems across the full range 0 < SL < infinity.
Individuals do need to specialise to some extent, otherwise they
get spread too thin - but fruitful action paths might still involve work
across several shock levels. That's what I find in my own work, at any rate.
Green Innovations Inc.
195 Wingrove Street
Fairfield (Melbourne) VIC 3078
Tel & fax: +61 3 9486-4799
Victorian Registered Association Number: A0026828M
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT