From: Shane Legg (email@example.com)
Date: Mon Feb 11 2008 - 07:17:15 MST
On 10/02/2008, Rolf Nelson <firstname.lastname@example.org> wrote:
> my own estimate is that SIAI directly saves mankind at about 200:1 odds.
If this is the case, then it seems to suggest that SIAI should be less
focused on building their own AGI, and more focused on the far more likely
scenario in which somebody else builds the first AGI and SIAI tries to
influence and steer the situation towards a good outcome.
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:00:57 MDT