Re: Destruction of All Humanity

From: Phillip Huggan (cdnprodigy@yahoo.com)
Date: Mon Dec 12 2005 - 16:42:17 MST


#2 goal systems are fun to speculate about for all of us not working with the difficult and costly actual nuts 'n' bolts of AGI engineering.
  I notice an analogy between AGI ideas and the evolution of government. We have democracy, and this resembles CV. However, Collective Volition can be improved upon. Eliezer's CV essay remarks that no one should be at the mercy of another's arbitrary beliefs. If you make people more like they'd like to be, I think you are magnifying the bad in people too. Regardless, freedom and free will are really at our core. Having an AGI enforce a simple Charter of Rights and Freedoms would ensure that none of us are impinged upon, instead of damning the minority.

  The CV essay states that no one is in a wise enough position to make normative judgements about such things, but this is simply not true. There are plenty of people employed in the social sciences who don't do much of value, but some of their products are very well-thought-out documents. One of the few books I've kept with me through my moves is titled "The Human Rights Reader". The "Canadian Charter of Rights and Freedoms" (http://laws.justice.gc.ca/en/charter/) is also being used as a model in many developing nations. Obviously this is not an optimal goal system, but I think it is an improvement on CV. I don't know how difficult it would be to program an AGI to implement such a charter while still preserving, effecting, or accelerating the many types of progress that seem open to us in the absence of AGI. Earth is bountiful enough that there are no tough zero-sum ethical dilemmas in which an AGI would actually have to take charter-essential physical resources from one person judged inferior and give them to another judged superior, at least until just before the end of the universe.

Ben Goertzel <ben@goertzel.org> wrote:
> I'm not clear on your criteria for enacting/not enacting the AGI's
> recommendation - some sort of cost-benefit analysis? Benefit to outweigh
> your own extermination? What could the criteria be...

Obvious issues would seem to include:

1) How certain I am that the computer has the same value system as I do

2) How certain I am of my own value system, of the way I identify and
analyze it, etc.

3) How much I trust the computer (based on all sorts of factors)

I admit I have not analyzed this issue very thoroughly, as there are a
lot of nearer-term, relevant issues that are also difficult and need
thinking through...

-- Ben
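
To make Ben's three criteria concrete, here is a minimal decision-rule
sketch in Python. It assumes each criterion can be reduced to a confidence
score in [0, 1]; the function name, the scores, and the threshold are all
hypothetical illustrations of one way the criteria might combine, not
anything Ben proposed.

# Hypothetical sketch only: combining the three criteria above into a
# single go/no-go decision. All names, scores, and the threshold are
# illustrative assumptions.

def enact_recommendation(value_alignment,   # criterion 1: computer shares my values
                         self_certainty,    # criterion 2: certainty in my own values
                         trust_in_agi,      # criterion 3: overall trust in the computer
                         threshold=0.9):
    # Multiplying the confidences means any single low score vetoes the
    # decision, which is arguably the right shape when the downside is
    # extinction.
    joint_confidence = value_alignment * self_certainty * trust_in_agi
    return joint_confidence >= threshold

# Even high scores on every criterion may not clear a strict bar:
print(enact_recommendation(0.97, 0.95, 0.96))  # ~0.885 -> False

The multiplicative (veto-style) combination is just one choice; a weighted
sum or a hard minimum over the three scores would encode different
attitudes toward partial trust.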
  

                        


