Re: Destruction of All Humanity

From: Ben Goertzel (ben@goertzel.org)
Date: Mon Dec 12 2005 - 15:46:32 MST


> I'm not clear on your criteria for enacting/not enacting the AGI's
> recommendation - some sort of cost-benefit analysis? Benefit to outweigh
> your own extermination? What could the criteria be...?

Obvious issues would seem to include:

1) How certain I am that the computer shares my value system

2) How confident I am in my own value system, and in the way I identify
and analyze it

3) How much I trust the computer (based on all sorts of factors)

I admit I have not analyzed this issue very thoroughly, as there are a
lot of nearer-term, relevant issues that are also difficult and need
thinking through...

-- Ben
