Re: An essay I just wrote on the Singularity.

From: Samantha Atkins (samantha@objectent.com)
Date: Fri Jan 02 2004 - 03:32:57 MST


On Wed, 31 Dec 2003 18:26:30 -0500
Randall Randall <randall@randallsquared.com> wrote:

>
>
> This is not at all true. I think it's quite arguable that
> strong superintelligence is impossible. For instance, it
> might be that physical law doesn't permit complexity above
> some ceiling. If so, it might be that the smartest possible
> humans are already very close to that limit.
>
Saying "it might be that physical law <blah>" is not at all illustrative of the point being arguable in any meaningful way. There is no reason to expect such to be the case at a level ruling out superintelligence and thus no reason to hold much stock in such a possibility. I am also confused by the phrase "the smartest possible humans". Do you mean "smartest humans occurring naturally without augmentation to date" or what? Now physical reality *does* place "some ceiling" or maximum possible intelligence, but that ceiling by all we know to date is comfortably far on the other side of SAI. The exact boundaries of what is the maximal possible intelligence in this physical universe is something that I doubt humans are intelligent enough to comprehend.

> This might not seem very likely to you, but unless you
> can logically rule it out, it isn't incompatible with
> a technologically induced singularity (in the weak sense).
>

It is not rational to require that each and every weak hypothetical be logically ruled out before one may point out the emptiness of an argument of the form "it might be possible that X". Proving that an "it might be possible that" is in fact impossible generally amounts to proving a universal negative.

- s


