Re: Destruction of All Humanity

From: 1Arcturus (arcturus12453@yahoo.com)
Date: Mon Dec 12 2005 - 12:17:04 MST


Ben,
   
  Thanks. I don't know very much at all about SIAI or Mr. Eliezer, so the opinion of someone who knows more than I do is worth something.
   
  It is an interesting hypothetical situation to consider because explanations are not obvious. (The hypothetical could be even more amped up if the AGI recommended the *immediate* destruction of all humanity.)
   
  I'm not clear on your criteria for enacting/not enacting the AGI's recommendation - some sort of cost-benefit analysis? Would the benefit have to outweigh your own extermination? What could the criteria be...
   
  gej
  

Ben Goertzel <ben@goertzel.org> wrote:
  Hi,

I don't normally respond for other people nor for organizations I
don't belong to, but in this case, since no one from SIAI has
responded yet and the allegation is so silly, I'll make an exception.

No, this is not SIAI's official opinion, and I am also quite sure that
it is not Eliezer's opinion.

Whether it is *like* anything Eliezer has ever said is a different
question, and depends upon your similarity measure!

Speaking for myself now (NOT Eliezer or anyone else): I can imagine a
scenario where I created an AGI to decide, based on my own value
system, what would be the best outcome for the universe. I can
imagine working with this AGI long enough that I really trusted it,
and then having this AGI conclude that the best outcome for the
universe involves having the human race (including me) stop existing
and having our particles used in some different way. I can imagine,
in this scenario, having a significant desire to actually go along
with the AGI's opinion, though I doubt that I would do so. (Perhaps I
would do so if I were wholly convinced that the overall state of the
universe would be a LOT better if the human race's particles were thus
re-purposed?)

And, I suppose someone could twist the above paragraph to say that
"Ben Goertzel says if a superintelligence should order all humans to
die, then all humans should die." But it would be quite a
misrepresentation...

-- Ben G

On 12/12/05, 1Arcturus wrote:
> Someone on the wta-list recently posted an opinion that he attributed to
> Mr. Yudkowsky, something to the effect that if a superintelligence should
> order all humans to die, then all humans should die.
> Is that a wild misrepresentation, and like nothing that Mr. Yudkowsky has
> ever said?
> Or is it in fact his opinion, and that of SIAI?
> Just curious...
>
> gej
>
  

                        


