From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed May 22 2002 - 17:18:05 MDT
> > Yes, I can see the arguments right now: "If it took like 50
> > years to get to
> > almost-human-level AI it must be at least another 200 years to get even
> > close to superintelligence..."
> I would say that the recent experience of Y2K shows how awareness can be
> propagated by opinion makers and decision makers when scientific evidence
> points to a future high-impact event.
> Consider the impact of superhuman AGI, as you all have been doing here:
> it is simply stunning -- changing everything. At the very least
> accounting firms will have to prepare business managers for the tremendous
> increase in productivity, and the great reduction in labor cost as a
> percent of global output.
Yes, but the average human had no strong emotional reason to reject the idea
of the Y2K problem ... except for the general tendency some people have to
reject bad news (counterbalanced by the general tendency of many others to
embrace it).
The Singularity plays on much deeper psychological and social issues,
so that either mass panic or mass blindness is far more likely than with
something like Y2K ...
The outcome you suggest is certainly a possible one, however.