Re: Uploading with current technology (Sorry Off topic)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Dec 09 2002 - 14:45:52 MST


Gary Miller wrote:
> Bill and my apologies to the mailing list for being off-topic...
>
> On December 9th you said:
>
> "Bill Gates may not be all that altruistic. Perhaps he is trying to
> counteract the bad publicity of the M$ antitrust case. His anti-AIDS
> campaign is wonderful, but it is interesting that it is targeted at
> India where there are many talented programmers, rather than Africa
> where there are not so many programmers."
>
> I can appreciate your cynicism in this day and age. But since its
> inception in 1994, the Bill and Melinda Gates Foundation has been
> responsible for over $2.5 billion in global health program grants! Even
> based on stock values before the economic downturn, this is a sizable
> percentage of his total net worth! Show me what the next 5 richest
> people in the world have given back to the world in this same time period!

Yeah... it is off-topic, actually. I've seen debates over Bill Gates, pro
and con, before - there's one going on right now on the Extropians list
under the heading "fruits of Bill Gates labor worth $50 billion", if you
want to join in. In my observation, off-topic threads can rapidly take
over a mailing list - there seem to be certain high-entropy threads, such
as capitalism vs. socialism, Mac vs. Linux vs. Windows, whether an
uploaded brain scan is "really you", et cetera, which represent the
default condition of a mailing list, and into which any mailing list
"wants" to slide unless energy is continually expended to prevent it.
So this thread really *is* off-topic, and I'm ruling it out of order for
SL4. This should not be taken as indicating disagreement (or agreement)
with any of the statements made, but there really are better mailing lists.

One exception to this general principle is Friendly AI discussion, which
usually represents a high-entropy state of AI mailing lists, but has been
explicitly permitted on SL4, since it has to happen somewhere.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

