From: Lee Corbin (email@example.com)
Date: Sun Mar 23 2008 - 01:02:17 MDT
> [Lee wrote]
>> Surely this emulation "really is" your wife---it's just that she's been
>> uploaded. That's better for her, surely you concede, than being
>> replaced by a fiendish actor who is really a different person.
> But how do I know the AI chip really "cares" about her values, or
> just behaves as if it cares? Isn't this the same situation as the actress?
Not quite, because in the situation described, we are assuming that
the AI has no particular motive to do such a thing. And, more importantly,
it would be twice as hard to accomplish. If you want to present someone
to the world, i.e., simulate rather than emulate them, then the simplest
thing to do is just to emulate them in some algorithm anyway. Why add
the baggage of an 'actor' one level higher up?
That possibility did occur to me, however, in another context. I suggested
that a benevolent AI might simulate someone when not enough information
is available to truly emulate that person, simply as a kindness to friends
and relatives who miss her. She could even claim to have a bit of
amnesia, with the AI helping the actor-software along wherever it could.
That's the only motive I can think of for an AI to merely simulate
someone when she could be emulated at less cost. Moreover, we want and
hope and pray that the AIs which take over are either us, or as
benevolent as we are.
> If a simulation is just good enough to fool me, and an emulation is perfectly
> computationally equivalent, then how would I know the difference?
Yes, there isn't any way to know.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT