OCD (was Re: [sl4] AI's behaving badly)

From: Tim Freeman (tim@fungible.com)
Date: Mon Dec 08 2008 - 04:32:13 MST


From: "Petter Wingren-Rasmussen" <petterwr@gmail.com>
>A person with OCD might check whether he turned off the stove several
>hundred times, because each time he gets a little bit calmer. In
>the long term, however, spending 3 hours a day checking the stove
>makes him think more about the possibility that he hasn't turned it
>off, and raises his anxiety level.

That's a good example. Suppose I have OCD, I want to drive to work,
and I keep getting into my car, getting out, going back into my house,
checking the stove, and getting into my car again. The AI observes
all this; it controls the workings of my car and the door to my house,
it can talk to me, and it knows an effective therapy for OCD. It
correctly estimates that the odds that I would want a fix are 50%.
What would a Friendly AI have to do in that case to deserve to be
called Friendly?

My best guess is that it should let me run around in circles until I
find my own way out. If I asked for an effective therapy for OCD, it
might give it to me, or perhaps first confirm that I really want it.
Giving me the therapy without my permission does not seem Friendly.
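For concreteness, here is a minimal sketch in Python of the
consent-gated policy I have in mind. All the names are hypothetical
(this is nobody's actual design), and the 50% figure is just the one
from the scenario above; the point is only that the AI acts on an
explicit request, never on its probability estimate alone.

    # Sketch of a consent-gated intervention policy. Hypothetical
    # names throughout; not a real system's interface.
    from dataclasses import dataclass

    @dataclass
    class Observation:
        explicit_request_for_therapy: bool
        estimated_p_wants_fix: float  # 0.5 in the scenario above

    def friendly_action(obs: Observation) -> str:
        if obs.explicit_request_for_therapy:
            # Even an explicit request may warrant confirming
            # before the therapy is actually administered.
            return "offer_therapy_and_confirm"
        # An estimated 50% desire is not consent; do not intervene
        # uninvited, however confident the estimate.
        return "do_not_intervene"

    # In the stove-checking scenario no request has been made, so
    # friendly_action(Observation(False, 0.5)) returns
    # "do_not_intervene": the AI lets me run around in circles.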

I would prefer to have some criterion by which OCD is wrong and, for
example, wanting sex without wanting to make babies is right, but I
don't see one.

-- 
Tim Freeman               http://www.fungible.com           tim@fungible.com

