Re: Destruction of All Humanity

From: micah glasser (micahglasser@gmail.com)
Date: Mon Dec 12 2005 - 22:49:03 MST


You seem to be indicating that an AI goal system should include the
governance of human beings. I think that this is a terrible mistake (please
disregard if I have misinterpreted). In my opinion the goal-system problem
has already been solved by philosophical ethics. The goal is the greatest
amount of freedom for the most people. This implies, I think, that an AI
should be directed by the categorical imperative just as humans are. The
only way to ensure that an AI will be able to successfully use this logic
is if its own highest goal is freedom. This is because the categorical
imperative restricts actions that one would not will to be permissible in
general. The categorical imperative also restricts treating any rational
agent as only a means to an end - in other words, as a tool. Therefore,
according to this ethical system, we must treat any AI life forms as people
with all the rights of people and demand that they treat other rational
agents the same way. This is a simple solution to an otherwise very
complicated problem. It's a fairly simple logic that can easily be
programmed into an AI.
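
To make this concrete, here is a minimal sketch in Python of what such a
filter might look like. Everything in it (the Action record, the two
boolean judgments, the permitted() test) is my own illustration rather than
an established design, and producing those judgments from a world model is
the genuinely hard part:

from dataclasses import dataclass

@dataclass
class Action:
    description: str
    universalizable: bool           # would one will this as a universal rule?
    uses_agent_as_mere_means: bool  # does it treat a rational agent only as a tool?

def permitted(action: Action) -> bool:
    # Categorical-imperative filter as described above: an action passes only
    # if it is universalizable and uses no rational agent merely as a means.
    return action.universalizable and not action.uses_agent_as_mere_means

# Toy usage; the ethical judgments are hand-encoded as data here.
lie = Action("deceive a person for gain",
             universalizable=False,
             uses_agent_as_mere_means=True)
print(permitted(lie))  # False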

On 12/12/05, Phillip Huggan <cdnprodigy@yahoo.com> wrote:
>
> #2 goal systems are fun for speculation, for all of us not working with
> the difficult and costly actual nuts 'n bolts of AGI engineering.
> I notice an analogy between AGI ideas and the evolution of government. We
> have democracy; this resembles CV. However, Collective Volition can be
> improved upon. Eliezer's CV essay remarks that no one should be at the
> mercy of another's arbitrary beliefs. If you make people more like they'd
> like to be, I think you are magnifying the bad in people too. Regardless,
> freedom and free-will are really at our core. Having an AGI enforce
> a simple Charter of Rights and Freedoms would ensure none of us are impinged
> upon, instead of damning the minority. The CV essay states that no one is
> in a wise enough position to make normative judgements about such things,
> but this is simply not true. There are plenty of people employed in the
> social sciences who don't do much of value, but some of their products
> include very well-thought-out documents. One of the few books I've kept with me
> through my moves is titled "The Human Rights Reader". Also, the Canadian
> Charter of Rights and Freedoms (http://laws.justice.gc.ca/en/charter/) is
> being used as a model in many developing nations. Obviously this is not an
> optimal goal system, but I think it is an improvement to CV. I don't know
> how difficult it would be to program an AGI to implement such a charter
> while still preserving, effecting, or accelerating the many types of progress
> we seem to have open to us in the absence of AGI. Earth is bountiful enough
> that there aren't any tough ethical zero-sum dilemmas where an AGI actually
> would have to take essential-for-charter physical resources from one judged
> inferior person and give to another judged superior person, at least
> until just before the end of the universe.
>
> *Ben Goertzel <ben@goertzel.org>* wrote:
>
> > I'm not clear on your criteria for enacting/not enacting the AGI's
> > recommendation - some sort of cost-benefit analysis? Benefit to outweigh
> > your own extermination? What could the criteria be...
>
> Obvious issues would seem to include:
>
> 1) How certain I am that the computer shares the same value system as I do
>
> 2) How certain I am in my own value system, in the way I identify and
> analyze my value system, etc.
>
> 3) How much I trust the computer (based on all sorts of factors)
>
> I admit I have not analyzed this issue very thoroughly as there are a
> lot of nearer-term, relevant issues that are also difficult and need
> thinking-through...
>
> -- Ben
>
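
As a purely illustrative aside, Ben's three criteria above could be combined
into a single go/no-go rule. Treating them as independent probabilities, and
the particular threshold, are assumptions of mine for the sketch, not
anything from the thread:

def enact_recommendation(p_shared_values: float,
                         p_own_values_sound: float,
                         trust: float,
                         threshold: float = 0.9) -> bool:
    # Multiply the three confidences (an independence assumption) and enact
    # the AGI's recommendation only if the product clears a chosen threshold.
    return p_shared_values * p_own_values_sound * trust >= threshold

# Example: high but imperfect confidence on all three still falls short.
print(enact_recommendation(0.95, 0.9, 0.95))  # False (product is about 0.81)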

--
I swear upon the altar of God, eternal hostility to every form of tyranny
over the mind of man. - Thomas Jefferson

