From: Matt Mahoney (firstname.lastname@example.org)
Date: Tue Apr 08 2008 - 14:55:24 MDT
--- Jeff L Jones <email@example.com> wrote:
> On Mon, Apr 7, 2008 at 4:48 PM, Matt Mahoney <firstname.lastname@example.org> wrote:
> > If you prefer, think of teleportation as equivalent to being
> > tortured and then having the memory of the torture erased. This
> > has zero utility to a rational agent because there is no change
> > in mental state.
> But this isn't a question about rational utility, it's a question
> about ethics. In my opinion, it's unethical to torture people,
> regardless of whether that fits with what you're calling the
> "rational" strategy. In other words, if I get $1000 for torturing
> a random person in Africa, and only $1 for not torturing them... you
> might say that rationally, I should choose the $1000. But that would
> be unethical.
I wish to separate the question "what is ethical?" from the question
"what will people decide is ethical?". I am interested in the second
question. For example, with regard to Pascal's Mugging (will you pay
$5 for a tiny chance of preventing 3^^^^3 deaths?), this has been
discussed before. Consider:

"A man cares more about a mole on his back than a million starving
people in China" -- Dale Carnegie.

"One death is a tragedy. A million is a statistic" -- Josef Stalin.
If ethics scaled linearly with utility, I would pay the $5. But in
practice it doesn't and I won't.
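Under strictly linear utility, the mugger's arithmetic dominates any finite cost; a minimal sketch with stand-in numbers (the probability, the dollar value per life, and the stand-in stake of 10^1000 lives are all assumptions here, since 3^^^^3 is far too large to represent directly):

```python
from math import log10

# Stand-in numbers, chosen only for illustration. 3^^^^3 is vastly
# larger than any representable number, so 10**1000 lives is used in
# its place; the conclusion only gets stronger with the real stake.
log10_p = -100.0            # assumed log10 probability the mugger is honest
log10_lives = 1000.0        # log10 of lives at stake (stand-in for 3^^^^3)
log10_value_per_life = 6.0  # assume each life is valued at $1,000,000

# Expected dollar value of paying, computed in log10 to avoid overflow:
log10_expected_value = log10_p + log10_lives + log10_value_per_life
# = -100 + 1000 + 6 = 906, i.e. an expected value of $10^906

cost = 5  # dollars demanded by the mugger
print(log10_expected_value > log10(cost))  # prints: True -- linear utility says pay
```

The point of the sketch is that no plausible skepticism (here, a 1-in-10^100 chance) can outweigh a linearly scaled stake of this size, which is exactly why actual human ethics refuses to scale linearly.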
This nonlinearity explains, for example, why we are outraged when a
famous football player promotes dog fighting, but indifferent to the
suffering of billions of chickens raised in tiny cages. We cannot
understand how the Holocaust could have happened, yet genocide still
occurs under our noses.

Let me rephrase your question. There are millions of Africans being
tortured today (e.g., by war, child slavery, disease, and soldiers
cutting off the limbs of their enemies' children in Sierra Leone).
Would you pay $1000 to prevent the torture of one of them?
So yes, you can argue that it is unethical to torture copy A in a
teleporter, or to torture someone and then erase their memory of it.
These beliefs are important in forecasting the outcome of a
singularity. They govern what kind of laws we will pass, what kind of
machines we will build. But don't confuse the question "what should we
do?" with the question "what will we do?" Only the second question is
relevant to the forecast.
-- Matt Mahoney, email@example.com