From: Dale Johnstone (DaleJohnstone@email.com)
Date: Sun Apr 08 2001 - 07:47:32 MDT
Brian Atkins wrote:
>To the list I ask: if for instance Wired magazine wanted to do a large
>article about SIAI in the near future, complete with cover image of
>Eliezer with a quote "This 21 year old cognitive scientist is building an
>AI that will end the world as we know it" do you think that would accelerate
>our plans or hurt them? Assume that besides talking about Eliezer and our
>plans, it also presents FAI as the answer to Bill Joy's AI concerns.
As much as I'm in favour of openness, you're not going to get an informed 'debate' in the media. Frankly, you're not going to get any kind of debate much outside of lists like this one. I would love to live in a society where people could raise the issues of the day, everyone would have their say, and we'd come to some sort of sensible conclusion and do the right thing - but sadly that's not how it happens. An article like that (if taken seriously) would lead to a polarization of opinion, most of it against us, and possible exposure to extremist groups. In general, AI is not perceived as harmful or in any way threatening, and I'd like it to stay that way.
>That probably is not how we would want to come across in an important
>article, but it might be how it would be presented by a magazine like Wired.
Of course, they want to sell magazines.
>You have to consider on the plus side that more people find out about us, and
>that we would likely be able to get more funding. On the downside, it might
>start a public backlash of unknown proportions. Or it might not; most people
>might do what they did re: Bill Joy and either say it won't happen, or that
>there is nothing they can do about AI so at least we are the best ones to
You don't need a great deal of funding to do this; it's mainly about having the right ideas. A calm, open, and unpolarized environment is more likely to lead to people making the right decisions and, ultimately, a successful Singularity.
The only essential publicity needed right now is for getting all potential seed AI writers to understand the importance of incorporating a Friendliness attractor into their designs. Eventually somebody, somewhere will build one - I'd like that person to be fully aware of the issues. It's very easy to underestimate the long-term consequences when engrossed in the immediate problems of AI. Eliezer's Friendliness paper does a great job of raising awareness, but I'd like to see more on the issue from other sources. This is something practical other list members can do *right now*: if you have a web site likely to be of interest to AI enthusiasts, write something about Friendliness and/or the Singularity and link to other sites discussing Friendliness - Eli's being a definite inclusion. You might then also like to submit your site to the Open Directory project at www.dmoz.org. (Eliezer edits the singularity branch there <grin>). Let's make sure no AI programmer can surf the web without coming across Friendliness!
As for Mr Joy's article: the more you talk about it, as we're doing right now, the longer it hangs around and the more we validate it as an attack. Don't feed it and it won't grow. Just write objectively, from neither the pro- nor the anti- camp.
The Singularity is just too far out for most people to engage their fears with yet, let alone debate, let alone fund. You might get away with publicity for those reasons, but do you want to risk forcing us underground if they take you seriously? I need funding myself, but right now it's possible to make good progress without P.R. stunts. I would prefer to keep it that way for as long as possible, if only to avoid causing problems for other groups.
If however the issue *does* explode publicly, then by all memes defend yourself.
BTW Congratulations on your tax-exempt status. :)
This archive was generated by hypermail 2.1.5 : Wed May 22 2013 - 04:00:20 MDT