Re: Friendliness and blank-slate goal bootstrap

From: Metaqualia (metaqualia@mynichi.com)
Date: Mon Oct 06 2003 - 23:31:08 MDT


> Since natural childbirth is so painful for both the mother and child,
> and C-section is available to anyone who wants it, why is natural
> childbirth still practiced? (If the mother gains somehow from the
> experience, is it still justified in terms of the pain caused to the
> child?)

C-section and natural childbirth both lead to the creation of a sentient
biological being built to an evolved design, in a resource-restricted
environment, with qualia for physical pain, rejection, disappointment,
abandonment, resentment, and so forth. The existence of biologically evolved
entities is _the_ pain machine, _the_ ultimate evil (or, if you don't
believe that, at least ponder: "an evil"), even without considering the fact
that this new entity is itself programmed to reproduce and create an
exponentially increasing number of pain-experiencing beings.

[moms and dads: so what _should_ you have done if you wanted to be Friendly
according to the definition of objective morality we are considering? You
should have waited until nanotech/the golden era/the singularity, and only
then thought about producing another sentient being (unless it's a Friendly
AI, or you engineered your baby to have an IQ of 300 so it could possibly
create such an AI).]

PS: there is some degree of discomfort caused in two beings who are
programmed to reproduce when they decide, for the sake of some higher
morality, that they won't. They "lose" something by not being able to
satisfy their biological imperative. The lesser evil is adopting a child.
The greater evil is giving birth to a human. The zero-evil scenario is one
in which either children never experience anguish OR the need to reproduce
is eliminated.

>What does this have to do with anything?
>-Robin, whose sister is a midwife.

Good question; I was just about to justify this whole discussion once more,
since we are at one of the capillary terminations of its implications and
it's no longer obvious where it came from.

Well, we are trying to weed out possible inconsistencies in an objective
theory of morality based on the hypothesis that unpleasant qualia can be
taken as the universal definition of evil. If we can formulate a universal
theory of morality based on qualia (such as: ouch!) and cannot formulate one
based on anything else, then there is a possibility that qualia are required
in order to create a universal theory of morality. In that case it would
affect the design of a Friendly AI, because we would steer it in some way
toward first figuring out or experiencing qualia, THEN deciding about the
fate of the universe.

>Other messages have expanded this into distinctions between pain,
>suffering, and fear. I'm trying to envision an FAI's difficulty in
>discerning where intervention is required/desired.

It would need to be perfectly empathic: by simulating someone's neural
processes, it would be able to feel what they are feeling. This probably
requires figuring out the engineering principles of qualia.

curzio


