From: Doug Keenan (firstname.lastname@example.org)
Date: Tue Jul 16 2002 - 18:43:43 MDT
> I asserted that it was going to be necessary to teach a baby AGI an
> approximation to some particular human moral system. Eli seemed
> to disagree with this, arguing that the baby AGI should be taught to
> treat all human moral systems equally (or something like that).
Is that how you understand his position? My take on his take of the
baby's journey includes a large enclosed circular playpen. The floor
of the pen is not flat; it declines slightly, more or less equally,
from every direction towards the center. Moving towards the center
releases energy (and could conceivably power the take-off?).
As I understand Eli's position, any competent baby will arrive at the
center regardless of its original orientation or position in the pen.
(This includes one with your orientation, I suppose, so I can't much
disagree with your approach either.) Treating human moral systems
equally requires only enough interest from each to place the baby in
the playpen; beyond a healthy crawling baby, not much more.
> I don't think this makes sense, since there are some human moral
> systems that say AGI and uploading are evil, others that say AGI
> is a waste of resources, etc.
"Some human moral systems" seriously close to placing a baby in
the pen? Sounds like the latter are celibate. The former? Hmmm.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT