Re: [sl4] Starglider's Mini-FAQ on Artificial Intelligence

From: Mike Dougherty (msd001@gmail.com)
Date: Thu Oct 08 2009 - 19:02:33 MDT


On Thu, Oct 8, 2009 at 9:59 AM, Matt Mahoney <matmahoney@yahoo.com> wrote:
>
> Forget it. An AI that models your mind could do anything you could.
>

If this AI models only my mind, it will have no reason to kill the
biological life it has subsumed. I imagine that after an upload, my
old-school flesh-and-blood self would still want to live out a human
lifetime. The uploaded self wouldn't care, because its goals (and
capabilities) are instantly so different from, and disconnected from,
the flesh that we're effectively different people anyway. I currently
believe I would let me live. I expect that immediately after upload I
would still let me live - in both contexts. So what does a human_v1 do
with itself after the technorapture? You've already suggested that
evolution has programmed us to fear death, so a few might suicide in
response to overwhelming change, but not all. You're right that
there's nothing about my identity that can't be more cheaply simulated
by the AI - except maybe for presence. Those remaining human_v1s will
probably return to a tribal/communal living arrangement, not out of
necessity (there won't be necessity for much) but out of the choice to
'be' around each other.

> Depends what you mean by "better". You were created by evolution. It wasn't your idea to program yourself to fear death and then die. But evolution knows better.

Evolution doesn't really know anything. It's just patient.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT