Re: [sl4] What is the probability of a positive singularity?

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Wed Jul 23 2008 - 15:04:32 MDT


--- On Tue, 7/22/08, Charles Hixson <charleshixsn@earthlink.net> wrote:
> Now what's the probability of humanity surviving if a
> positive singularity does *not* occur?
...
> I rate the probability of humanity surviving in its
> current form as extremely close to zero.

That is true if you assume that the only thing that can stop a singularity is human extinction. However, Bostrom has outlined other types of existential risk. For example, a nuclear war or plague that killed all but a few hundred humans, with the associated loss of technology, could delay a singularity by thousands of years.

But those kinds of disasters are not what worry me.

Another possible scenario is that once we have the technology to reprogram our brains (either in place or uploaded), a fraction of humans won't go along. The brain is wired to seek the state x that maximizes utility U(x). Once you can edit your own brain, the shortest path to maximum utility is to put yourself directly into that state; but once there, any perception or thought would be unpleasant, because it would move you to a different, lower-utility mental state. The fraction who realize that utopia = death, and that evolution is smarter than they are, will be the ones who pass on their genes. There is a good reason that humans fear death and then die anyway, but not all of us realize it (including SIAI, it seems).
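To make the argument concrete, here is a minimal Python sketch (the states and the utility function are made up for illustration): an agent that can set its own mental state directly jumps to the utility-maximizing state and then rejects every subsequent change, since nothing can improve on the optimum. A frozen mental state is functionally indistinguishable from death.

import random

random.seed(0)

STATES = list(range(10))                   # hypothetical mental states
U = {x: random.random() for x in STATES}   # arbitrary utility U(x)

def run(steps=1000):
    # Agent that has reprogrammed itself to sit at argmax U(x).
    x = max(STATES, key=U.get)             # jump straight to the optimum
    changes = 0
    for _ in range(steps):
        candidate = random.choice(STATES)  # any perception or thought
        if U[candidate] > U[x]:            # never true at the optimum
            x, changes = candidate, changes + 1
    return changes

print(run())  # prints 0: the mental state never changes again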

A singularity is a competitive, evolutionary process, but with evolution greatly sped up by technology. I have argued before that agents cannot recognize (and therefore cannot deliberately design or create) agents of greater intelligence, at any level and under any reasonable definition of intelligence. There are no known classes of mathematical problems that are provably hard to solve yet easy to check, no IQ tests for adults over 200, and no Turing tests for gods. We don't get to choose the path that AI takes.
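To pin down what "provably hard to solve and easy to check" would mean, here is a minimal Python sketch using SAT (the formula is made up). Verifying a claimed solution takes linear time; the only general solving method we can exhibit is exponential brute force. But that asymmetry is conjectured, not proven, which is exactly the gap: we cannot build a test whose answers a weaker agent can grade but only a stronger one can produce.

from itertools import product

# A CNF formula: each clause lists signed variable indices
# (positive = the variable, negative = its negation).
formula = [[1, -2, 3], [-1, 2], [2, -3]]
n_vars = 3

def check(assignment):
    # Verifying a candidate takes time linear in the formula size.
    return all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in formula)

def solve():
    # Brute-force search: up to 2^n candidates. No provably faster
    # general method is known, but neither is a proof that none exists.
    for bits in product([False, True], repeat=n_vars):
        if check(bits):
            return bits
    return None

print(solve())  # cheap to check; expensive (as far as we know) to find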

A singularity means the end of humanity's reign as the most intelligent species. Whether humans survive depends on the ethical system of the dominant lifeform, which we can neither control nor predict.

-- Matt Mahoney, matmahoney@yahoo.com


