Re: Singularity Arrival Estimate

From: Eugen Leitl (eugen@leitl.org)
Date: Fri May 17 2002 - 10:12:12 MDT


On Fri, 17 May 2002, Dani Eder wrote:

> > Progress measured in what?
>
> The rate of change of technology in general.

Which technologies, specifically? Rate of change measured in what? There is
a succession of core technologies powering progress in any given period, so
your metric will necessarily either have limited applicability in time, or
be complex, including a list of technologies, quantities expressed in the
same metric, impact factors, and the like.
 
> > Can you cite any data backing up this idea of yours?
>
> The pace of technology change has increased over human history, as has
> the human population. Since the human brain hasn't changed size
> significantly in the past 10,000 years, it is reasonable to assume the
> average 'inventiveness' hasn't either. Thus the rate of generation of

This is not a reasonable assumption. While the hardware has remained
constant over the last several tens of millennia, cultural evolution has
not. Both the allocation of resources and the quality and quantity of
resources available have changed.

I agree with you in principle, but I disagree that you can force the
explanation into a monocausal framework, and into a metric as simple as
"bits/s processing rate". This way lies bad science.

> new ideas, and thus the pace of progress, would be simply proportional
> to the human population. This is the simplest hypothesis.

Rules similar to economies of scale seem to indicate that the payoff grows
more than proportionally in larger populations. I presume you could easily
pull up hundreds of papers from a range of disciplines (mostly economics,
probably) in a single Google session, including some with enough real-world
data.
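
Purely for illustration, a toy sketch of the difference between the two
hypotheses (the rates and the exponent are arbitrary placeholders, not
fitted to any data):

# Toy comparison of a linear vs. a superlinear (economies-of-scale)
# model of idea generation as a function of population size.
def ideas_linear(population, per_capita_rate=1e-6):
    # "simplest hypothesis": output proportional to head count
    return per_capita_rate * population

def ideas_superlinear(population, per_capita_rate=1e-6, exponent=1.2):
    # economies-of-scale variant: payoff grows faster than linearly,
    # e.g. because larger populations support more specialization
    return per_capita_rate * population ** exponent

for pop in (1e6, 1e8, 1e10):
    print(pop, ideas_linear(pop), ideas_superlinear(pop))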

I'm objecting to reasoning based on pure armchair conjecture while the
tools for producing significantly better models are literally at your
fingertips.
 
> In general, you can create a range of hypotheses or models in which
> the rate of progress is a function of quantity of intelligence and
> quality of intelligence. In the example above, it is assumed that
> only quantity contributes. In the second example from my previous
> post, only the quality factor was assumed to contribute. I think in
> reality there is a contribution from both factors. So to come up with
> my projection, I took both extremes and averaged them.

As long as there are no metrics and no real-world data to fit your model
to, there is very little point in building models with no real-world
constraints.

> The 10^17 figure is the number of synapses in the brain (10^15) times
> the average firing rate of the synapses (100 Hz). The uncertainty is

The assumptions here are that the synapse is the processing element, and
that the spike firing rate is identical to the bit rate. Neither is a safe
assumption.
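
To make the sensitivity explicit, a back-of-the-envelope sketch (the
parameter ranges are illustrative assumptions, not neuroscience claims):

# The 1e17 figure and how it moves when the contested assumptions move:
# bits per spike and the fraction of the brain doing the relevant work.
def brain_bits_per_second(synapses=1e15, firing_rate_hz=100.0,
                          bits_per_spike=1.0, usable_fraction=1.0):
    return synapses * firing_rate_hz * bits_per_spike * usable_fraction

print(brain_bits_per_second())  # 1e15 synapses * 100 Hz = 1e17

# vary each contested assumption by an order of magnitude
for bits in (0.1, 1.0, 10.0):
    for fraction in (0.1, 1.0):
        print(bits, fraction,
              brain_bits_per_second(bits_per_spike=bits,
                                    usable_fraction=fraction))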

> whether there is a lot of redundancy in the brain's wiring or whether
> a synapse firing represents more than one bit of data, and also how
> much of the brain is used for running the rest of the body and thus
> might not be needed for pure thinking.

What is "pure thinking" please? Why should "pure thinking" be relevant for
real-world AI?
 
> While I don't know which hypothesis is correct, the interesting thing
> to me is they lead to similar answers as to timeframe.

That's not surprising, since your reasoning is based on an exponential
progress model.
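
One way to see why (a sketch with made-up numbers: the doubling time, the
starting capacity and the targets are all placeholders): under exponential
growth, even an order-of-magnitude uncertainty in the target shifts the
arrival date by only a few doubling times.

import math

doubling_time_years = 1.5          # Moore's-law-style placeholder
current_capacity = 1e12            # arbitrary starting point
for target in (1e16, 1e17, 1e18):  # wide uncertainty in the goal
    years = doubling_time_years * math.log2(target / current_capacity)
    print(target, round(years, 1))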


