From: Phillip Huggan (firstname.lastname@example.org)
Date: Fri Dec 09 2005 - 13:18:52 MST
Disclosing the location of the project is a bad idea. But brainstorming the types of advice you'd implement and the types you wouldn't (like modifying its own code) could be done as an open collaboration. The researchers would just have to follow some agreed-upon protocol. Of course there are bound to be grey areas in implementing what the Oracle spits out, so not everything could be ideally planned ahead of time, and the real-time onsite researchers would have to tread carefully.
Michael Wilson <email@example.com> wrote: <SNIP> I must
reluctantly classify Nick Bostrom's proposal to make an
Oracle generally available (or at least, publicly known
and available to experts) as hopelessly naive. Clearly
there is vast potential for misuse and abuse that would
be unavoidable if publicly known, at least in the short
space of time before some fool asks one how to build a
seed AI that will help them with their personal goals. It
does seem likely to me that an Oracle built and used in
secret, by sufficiently moral and cautious researchers,
would be a net reduction in risk for an FAI project.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT