From: Tim Freeman (firstname.lastname@example.org)
Date: Thu Apr 10 2008 - 08:23:11 MDT
From: "Nick Tarleton" <email@example.com>
>Utility is not just how good something feels, it's how good I
>rationally judge something to be; it seems like I currently rationally
>judge 2*N deaths (say) to be twice as bad as N deaths for all N, and I
>would choose to modify myself to actually *feel* that difference and
>eliminate scope insensitivity...
That sort of altruism is exploitable even without considering absurdly
improbable hells. All I need to do to exploit you is breed or
construct or train a bunch of humans who want exactly what I want.
It's even better if they'll commit suicide, or perhaps kill each
other, if they don't get it. Then I provide evidence of this to you,
and you'll want what I want.
You need to fix the set of people you care about, rather than allow it
to be manipulated by an adversary. You can't afford to give others
the power to produce entities that you care about.
-- Tim Freeman http://www.fungible.com firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT