From: Ben Goertzel (firstname.lastname@example.org)
Date: Sun Jun 23 2002 - 07:56:46 MDT
> Maybe the Singularity will *not* be a projected positive feedback loop,
> but if so it will be faster, not slower. I am as uncertain as you, or
> more so, but my uncertainty is a volume centered around the positive
> feedback loop that seems less odd for an AI than the strange human way
> of doing things, not a volume centered around the human world.
Anyway, as I keep saying, this thread began not to debate whether the
posited positive feedback loop will exist (though this is not certain, it's
a hypothesis you and I both accept), but to debate the *quantitative
parameters* of this positive feedback loop.
So far, nothing you have said has specifically addressed this issue.
That is, nothing but your own unsupported intuition has been offered in
favor of your posited "one month upper bound" for the transition from
human-level AI to vastly superhuman AI.
Nor have I offered any real evidence in favor of my own intuition that it
will be a bit slower than this!
What I am questioning is not your confidence that the feedback loop itself
will exist, but your confidence in your quantitative estimate of the speed
with which the feedback loop will lead to intelligence increase.
-- Ben G