From: Mitch Howe (firstname.lastname@example.org)
Date: Tue May 14 2002 - 15:27:21 MDT
>However, one thing that at least seems to be certain is that speculation
>concerning when a particular Singularity scenario might occur contributes
>precisely nothing towards the achievement of such an objective.
Not true, although the connection is rather indirect. Predicting when
certain Singularity scenarios might become possible is an important part of
short- and long-range planning for those who would help bring it about.
If the Singularity is still WAY out there due to hardware limitations, for
example, then it doesn't make much sense to throw a lot of time and money at
building an AI in the near term; for the sake of humanity it would be better
to keep these projects incubating while devoting more resources to things
like cancer research and traditional charities. But if it seems that the
Singularity could be right around the corner if the right people and tools
were brought together now, then it makes a lot more sense to walk right on
past the Salvation Army Santa and drop your coins down the pocket protector
of your nearest pro-AI computer geek.
Building solid arguments for why the Singularity does, in fact, seem
possible in the short term could go a long way towards securing funding
from the large group of financially independent individuals who see such
things as possible but too remote to concern themselves with directly.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT