RE: Shock level 4 (was Re: META SL4)

From: Toby Weston (LordLobster@yahoo.com)
Date: Thu Apr 24 2008 - 16:51:47 MDT


That all sounds like a valid, internally consistent set of opinions.
But I still think SL4 sounds better.
We can but aspire.

- original message -
Subject: Shock level 4 (was Re: META SL4)
From: Matt Mahoney <matmahoney@yahoo.com>
Date: 24/04/2008 8:22 pm

--- Mike Dougherty <msd001@gmail.com> wrote:

> I have reviewed Shock Levels. There is currently nothing that mere
> mortals may discuss that is SL4. I spent a long time waiting for a
> discussion that was truly on-topic for the list.
>
> Is it even possible for an SL4 thread to be discussed?
>
> I'll wait for an SL4 topic before posting again.

I also reviewed http://www.sl4.org/shocklevels.html and will try. First, I adjust
my belief system:

1. Consciousness does not exist. There is no "me". The brain is a computer.
2. Free will does not exist. The brain executes an algorithm.
3. There is no "good" or "bad", just ethical beliefs.

I can only do this in an abstract sense: I pretend there is a version of me that
thinks in this strictly mathematical way, while the rest of me pursues normal
human goals in a world that makes sense. It is the only way I can manage it;
otherwise I would have no reason to live. Fortunately, human biases favoring
survival are strong, so I can do this safely.

My abstract self concludes:

- I am not a singularitarian. I want neither to speed up the singularity nor to
delay it. In the same sense, I am neutral about the possibility of human
extinction (see 3).

- AI is not an engineering problem. It is a product of evolution (see 2).

- We cannot predict the outcome of AI because evolution is not stable; it is
prone to catastrophes.

- "We" (see 1) cannot observe a singularity because it is beyond our
intellectual capacity to understand at any pre-singularity level of intellect.

- A singularity may already have happened, and the world we observe is the
result. We have no way to know.

Discussions about friendliness, risks, uploading, copying, self-identity, and
reprogramming the brain are SL3. SL4 makes these issues irrelevant.

-- Matt Mahoney, matmahoney@yahoo.com


