Re: physical pain is bad (was Re: Dynamic ethics)

From: Russell Wallace (russell.wallace@gmail.com)
Date: Mon Jan 23 2006 - 17:31:41 MST


On 1/23/06, Jeff Medina <analyticphilosophy@gmail.com> wrote:
>
> I've yet to see a decent argument for why this would be a bad thing.
> I've seen a whole lot of responses along the lines of "I get to choose
> for myself what I want to do!" and "We must respect the wishes of
> autonomous, intelligent, rational adult humans!", and both of these
> fail.
>
> The first reply is effectively the same argument a puppy gives when
> you take it to the vet, or a child gives when you make it eat its
> vegetables. And we near-universally agree that it doesn't matter what
> the puppy or the child or the
> other-less-intelligent-less-rational sentient being thinks it wants;
> we know better (most of the time), and we impose our will for the good
> of the "lesser" being.

This is circular logic. You're claiming the conclusion "transhumans have the
right to treat humans like human adults treat children", but your argument
for it is "humans are to transhumans as children are to human adults"; so at
the very least you have provided no non-circular argument _for_ your
position.

Here are some arguments against:

Children are specifically adapted for an environment where their volition
will at times be overridden by adults; they are not as they would need to be
if they were free to make their own decisions at all times. Human adults are
_not_ similarly adapted to have their volition overridden by transhumans.

As a matter of objective fact, an 8 year old whose volition is sometimes
overridden by his parents will grow up happier - and more grateful for the
results - than one whose volition never is. The counterpart claim for humans
versus transhumans is not only unproven, but would be a null argument even if
it were true, because the criterion is too trivially easy to meet: wireheading
would satisfy it.

Children just have to live with the way things are; they don't get a say in
designing their parents. We do get a say in designing the entities that will
exist in our world next century.

> The second reply is a variation on the first, but requires more
> comment. Specifically, it holds up the *current* level of autonomy,
> intelligence, or rationality most humans exhibit as sacrosanct, an
> in-practice binary distinction between our level and that of "lesser"
> beings. But one of the key realizations leading to transhumanism is
> that there is nothing special or sacred about humans-as-they-are-now
> in and of itself.

This isn't a "realization", but a disagreement with a moral axiom. I value
humanity in and of itself - it constitutes my primary reason for wanting the
Singularity to proceed and go well, and my sole reason for wanting it so
badly that I'm willing to devote my life to trying to increase the
probability that this will occur.

I understand that other people have different axioms; my proposed solution
is therefore to find a path that, to the extent reasonably possible,
respects the right of everyone (you included) to get what they want out of
the future. I will suggest that "I get what I want and you get what you
want" is something that we can agree on as a shared goal, and is therefore
objectively better than one party working towards "I get what I want, blow
you Jack" - this sort of thing is where the concept of morality comes from
in the first place, after all.

> To claim the current level of rationality found in
> humans is the delineator for when we or any other higher beings should
> respect another being's choices/autonomy is to place yourself squarely
> in the Fukuyama/Kass camp of error.
>
> One of the main problems I personally have with being forced to live
> this or that way or do thus-and-such or undergo certain medical
> procedures is that I can't be sure the higher being has my best
> interests in mind.

That is one of the problems I have with that idea, but I also have the
problems "I value freedom in and of itself" and "Maybe the 'higher' being's
definition of 'best' isn't the same as mine"; and you haven't done anything
to address those.

> But neither can puppies and neither can children,
> and that fact doesn't stop us from forcing our decisions on puppies &
> children, so why should our petulant protests stop posthumans from
> doing the same to us?

If this "might makes right" logic is to be accepted, then what humanity
should do is protest now, while its protests still carry force: i.e. shut
down the Singularity Institute forthwith. Are you really sure this is the
right answer?

- Russell



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT