Re: Singularity skeptics

From: Charles D Hixson (charleshixsn@earthlink.net)
Date: Thu May 11 2006 - 17:20:06 MDT


Aleksei Riikonen wrote:
> On 5/12/06, Joshua Fox <joshua@joshuafox.com> wrote:
>> SL4 Members: Who would you list as the top "Singularity skeptics"?
>>
>> Though Stanford seems to define "Singularity skeptic" as someone who
>> thinks that we
>> should actively try to slow technological acceleration, I'm actually
>> interested in learning
>> of the top thinkers whose opinion you respect but who have argued
>> that anything like a
>> Singularity is very unlikely.
>
> If what we mean by the word Singularity is the creation of
> smarter-than-human intelligence, then I'd say that no-one who argues
> that such an event is very unlikely can possibly be a top thinker :P
> (Unless s/he is very poorly informed of a number of topics, or lying
> about his/her real assessment of the situation for political reasons.)
>
> It would *almost* be possible for a top thinker to assign a negligible
> probability for the Singularity because of being convinced that an
> existential risk will manifest before we'll get that far, but I think
> our situation is still short of being hopeless enough. I hope it
> remains so...
Let's think about this definition a bit. I would argue that any well-run
group constitutes a "smarter-than-human intelligence", especially if it
contains experts in several different fields who are listened to
respectfully.

I'm sure that it is possible to define what is meant, but I would say
that the proffered definition needs a bit of work. OTOH, in Vinge's
original paper one of the scenarios that he proposed DID involve groups
of people as integral parts of the "mind" of the singularity. So even
from the start it was intended that people be considered as a possible
component. I.e., the "super-human intelligence" would need to be not
the people, but rather some larger entity that contained them as components.

I would further suggest that one indispensable requirement for the
Singularity to have been achieved is that the thought processes of this
super-human entity be considerably faster than those of a discussion
group. How much faster? Note that mailing lists have dramatically
sped up the rate of information transfer, but we don't consider that
*they* have created the Singularity. (Well, not usually. Don't forget
the character in Accelerando who argued that the Singularity occurred
when the first Internet packet was transmitted.)



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT