Biowar vs. AI as existential threat

From: Philip Goetz (philgoetz@gmail.com)
Date: Mon Feb 13 2006 - 15:18:13 MST


Numerous people on this list are truly alarmed at the prospect of AIs
wiping out humanity. It seems to me that it would be easier to
develop an existential-threat disease than to develop AI. For instance,
it is already known how to make a 100%-lethal
vaccine-and-antibiotic-and-natural-resistance-resistant smallpox. If
someone were to modify it further to have a latent period - say, of
three years after infection - we would be looking at something that
could kill pretty much everyone on the planet.

Humans are prone to imagining utopian social orders, and to believing that
they know the one true way to order society so as to cure all its
ills. Such schemes have always failed. I think it is inevitable that
some new world-reformer will decide that their favorite social order
will work this time, if they can only start from a clean slate, with
no other competing social orders. This person, not the military
invader or the religious zealot, is the most dangerous; unlike our
other homicidal nutcases, they have an incentive to kill EVERYONE on
the planet except a selected few.
