From: Lee Corbin (email@example.com)
Date: Wed Mar 26 2003 - 01:25:56 MST
> firstname.lastname@example.org writes:
> > By now, we should have seen a large portion
> > of the visible universe converted to computronium.
> I'm hesitant to be so terse about this, but it has always
> been my expectation that perhaps there already has been a
> singularity somewhere/when else and its results are simply
> beyond our comprehension.
Hmm. I would call this the "side-effect hypothesis", namely
that we may exist between the cracks of tremendous processes
busily utilizing nearly all available resources. (If the big
S didn't happen here, or to us, then I don't see much chance
of it running simulations of us.)
In this way I suppose that a vastly greater civilization could
have already swept through whatever region of space we occupied.
But then I think I would also be forced to suppose that all
our compute potential, assuming we're real at all, is of
negligible utility to that entity/civilization.
However! This would violate the "Lust to Complete Expansion"
that I was describing before, and equivalently, go counter to
Eliezer's persuasive contention that there are no conserved
quantities (that would inhibit expansion). Our instinct to
suppose that neither attention nor software can be readily
and freely multiplied must be a legacy of our <=H heritage.
Next, you address the moral dimension:
> [a singularity somewhere] such that any moral questions we
> have for it are resolved quickly and easily for an omniscient
> being (like a universe-encompassing post-singularity mind)
> but not at all for us.
I think that what you are saying is that it can resolve
any moral question that we can conceive of, whether or
not *we* could resolve it.
Well, this sounds to me like it's an attempted crossing
of the is/ought barrier. How should we know what's
important to an SI? If we are "living between the cracks"
then, as those unforgettable processes in "A Fire Upon
the Deep" whispered,
"We should not be."
"Talking like this?"
"Talking at all."
But we *do* know what kind of SI we'd like to see happen, and
I thought Eliezer addressed the moral component of that case.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT