From: James Higgins (email@example.com)
Date: Tue Jul 16 2002 - 13:56:54 MDT
Eliezer S. Yudkowsky wrote:
> James Higgins wrote:
>> I don't believe that to be the case. The Sysop scenario you, yourself,
>> suggested previously is highly immoral in my opinion. If you are
>> correct about Ben then you have both proposed immoral (subject to
>> perspective) goals.
> James, I never, ever, ever suggested explicitly programming in the Sysop
> Scenario. I thought (and still think) that imagining an FAI having to
The distinction between explicitly programming it in and endorsing it is only slightly meaningful. And I do believe that you endorsed the Sysop scenario. If I am mistaken, please let me know.
> serve as the OS of a universe provides an extreme way to test your
> conception of morality - it's a more stringent test than we apply to the
> morals that humans use to interact with each other. But that wasn't
> originally the rationale of talking about the Sysop Scenario. The point
> of the Sysop Scenario is that it provides a concrete disproof of these
> three claims:
I tend to think of the Sysop scenario as a different kind of evil, and
thus not as solving the listed problems - just replacing them with a
different one.
> The Sysop Scenario is not the only possible solution to any of these
> three problems. It just demonstrates that at least one humanly
> conceivable solution exists.
Since I don't believe it is a solution to any of these problems, just a
different problem, I'd be happy to discuss any of the other possible
solutions.
> But apparently rational discussion of the Sysop Scenario just isn't
> possible, even on the SL4 mailing list.
I'm not going there...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT