Re: Volitional Morality and Action Judgement

From: Michael Roy Ames (michaelroyames@yahoo.com)
Date: Sat May 29 2004 - 19:45:06 MDT


Mark Waser,

You wrote,
>
> Ah. I wasn't clear. What I was envisioning is a
> set-up with multiple FAIs where none of them are
> permitted to take an action unless all agree.
> Having three human beings with nuclear keys (all
> of which are required to fire the missiles) is
> less risk than having two or one.
>

Having multiple independent AIs with different histories is already acknowledged as a good idea. Whether such a group of reasoners would count as a single FAI or several is a matter of definition. I suspect that this kind of redundancy would be highly beneficial in avoiding errors.
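In code terms, the proposal is essentially an AND-gate over independent approvals: no action executes unless every reasoner signs off. Here is a minimal Python sketch of that unanimity rule, assuming each AI can be reduced to a boolean approval check (the Agent class, the veto rules, and propose_action are hypothetical illustrations for this thread, not anyone's actual design):

    from typing import Callable, List

    Action = str  # toy stand-in for a real action representation

    class Agent:
        """Toy stand-in for one independently developed reasoner."""
        def __init__(self, name: str, veto_rule: Callable[[Action], bool]):
            self.name = name
            self.veto_rule = veto_rule  # returns True if this agent objects

        def approves(self, action: Action) -> bool:
            return not self.veto_rule(action)

    def propose_action(agents: List[Agent], action: Action) -> bool:
        """Execute only if every agent independently approves (unanimity)."""
        if all(agent.approves(action) for agent in agents):
            print(f"Action '{action}' approved by all {len(agents)} agents.")
            return True
        print(f"Action '{action}' vetoed; not executed.")
        return False

    # Example: three agents; the third vetoes anything irreversible,
    # mirroring the three-nuclear-keys analogy above.
    agents = [
        Agent("A", lambda a: False),
        Agent("B", lambda a: False),
        Agent("C", lambda a: "irreversible" in a),
    ]
    propose_action(agents, "send status report")       # approved by all
    propose_action(agents, "irreversible deployment")  # vetoed by C

The point of the three-keys analogy survives the simplification: a single wrong approver cannot act alone, so the failure probability of the group is bounded by the chance that all approvers err at once.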

>
> I'm not relying on Eliezer. I do think that
> it's unfortunate that others apparently are.
> And, as I've said, I have volunteered to donate
> myself.
>

Kudos to you. I have also volunteered, and I do donate.

>
> Watching Eliezer refuse to give the time of day to
> those individuals who appear most likely to be most
> capable of assisting him (or developing FAI on their
> own) is most frustrating.
>

I think Eliezer's response to this question clarifies the situation
considerably. To my personal knowledge, those individuals who have shown
significant understanding of the FAI ideas, and who have contributed
significant insights or pointed out problems, have received considerably
more than 'the time of day'. This includes Ben, by the way.

Michael Roy Ames.
