From: Krekoski Ross (email@example.com)
Date: Tue Mar 04 2008 - 13:40:27 MST
Unless, of course, it does, and I'm in the situation you proposed right now,
and there never was a "humanity," so to speak. But disregarding that...
On Tue, Mar 4, 2008 at 8:39 PM, Krekoski Ross <firstname.lastname@example.org> wrote:
> Yes, the tree falling in the forest makes a sound.
> And no, it's not preserved, since "humanity" does not equate with my
> construal of humanity.
> On Tue, Mar 4, 2008 at 2:39 PM, Mike Dougherty <email@example.com> wrote:
> > On Tue, Mar 4, 2008 at 1:49 AM, Jeff Herrlich <firstname.lastname@example.org>
> > wrote:
> > > It's probably a safe assumption that virtually all humans would prefer
> > > that humanity wasn't murdered by an amoral AGI. We can use reasonable
> > > eliminations to provide basic guidance. As a matter of reality, there will
> > > always be a minority of people who will inevitably disagree with the
> > > selected Friendly super-goal. This is inescapable. If we don't assign a
> > > specific Friendly super-goal, humanity will be destroyed by default.
> > >
> > How much of humanity would you (plural/non-specific/rhetorical)
> > recognize if you were the last representative of your kind?
> > Suppose you are cryopreserved throughout the mass uploading of
> > consciousness, or lost in space until being found by aliens (the old-school
> > plot device). When you wake up, you are eased into the truth of your
> > situation by a "Friendly" (whatever that means) doctor. Your behavior is
> > assumed to be the normative average for your species, and their
> > model of 21st century humanity is based on this assumption. Through
> > discussion/exhumed memories of your (presumed-dead) friends, a model of them
> > is created and you are reunited with them. (Possibly with a story
> > explaining how they were miraculously also "saved".) Your reactions provide
> > feedback to correct the difference between your model of your friend and the
> > one running their simulation. (Of course, with sufficiently good memory
> > scanning you wouldn't notice that this person is not exactly as you
> > remember.) Given an initially convincing portrayal of your friend, over time
> > your own model of them would reflect their new/current behavior. Further
> > discussion of shared friends generates data to seed their reincarnation.
> > After the proverbial six degrees of separation, everyone you can remember
> > would be regenerated as simulated 'peers'. Surely any group advanced enough
> > to actually DO this can also infuse non-deterministic yet realistic
> > behaviors to cover the gaps in your memory and portray a convincing enough
> > representation of humanity that you may never learn that you are the only
> > "real" person from your time.
> > In this scenario, is humanity preserved?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT