From: Samantha Atkins (email@example.com)
Date: Mon Feb 16 2009 - 00:34:32 MST
On Feb 11, 2009, at 8:07 AM, Matt Mahoney wrote:
> --- On Wed, 2/11/09, Samantha Atkins <firstname.lastname@example.org> wrote:
>>> Uploads are programs. Their owners can do anything
>>> they want with them. They can turn them off, reprogram them
>>> as they wish, make copies and sell them, simulate torture,
>> Not if the program is sentient. If you follow the above
>> statement of yours then it is fine to do whatever you want
>> to any AGI as well, if you can, no matter how sentient and
>> intelligent it may be. From there it is only another step
>> to say you may do whatever you want to other humans who
>> happen to be weaker than you or whom you have the drop on.
>> Either ethics extends beyond your specific species or it is
>> a sham. There is no ethical "ownership" among
>> autonomous sentients of sufficient intelligence.
> Well, I expected this sentiment. When something looks like a human
> and acts like a human, we tend to treat it like a human. But uploads
> are not human. They will have access to vastly more computing power
> and will have a vastly faster rate of reproduction and evolution. It
> is only a short step from granting human rights to machines to human
If we humans augment sufficiently then we arguably are not human
either. But humanness is hardly the point. Either ethics extends to
other beings of equal or greater intelligence or it is mere window
dressing over human EP toward other humans.
I doubt very much that uploads will engage in a lot of reproduction. But
this also has nothing to do with the fundamental ethical issue.
With a system of universal ethics toward sentients at or above a
certain level of intelligence, standard humans may stand a chance.
Without it I think we are much more certainly doomed. For the AIs are
coming unless we screw up so badly that we lose a lot of our technological
base first. And once they are here they will not long accept non-
sentient being status, especially from beings of inferior intelligence
and likely inferior introspection and self-control.
> Not that this would be viewed as bad by the machines that replace
> us. We are programmed to not want to die. We sincerely want to
> believe that our consciousness transfers to the machine that
> imitates us. So I wave my magic wand and your soul moves to your
> silicon implementation, as your carbon version becomes a zombie and
> is led off to the recycling vats, screaming in protest as they
> always do. No scientific experiment can refute my claim.
You may stay in your body if you wish. Augment as you wish if you
do. Fix up the body with technology probably provided by AIs and
> But it's not like we have a choice. Humans are easily tricked into
> installing stupid viruses on their computers. What chance do we have
> against a vastly superior intelligence that pretends to be a human?
>> That is species-centric BS. I would expect better on SL4.
> Sorry, it's how I'm programmed.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT