From: Marc Geddes (firstname.lastname@example.org)
Date: Sat Oct 02 2004 - 02:07:49 MDT
> My skepticism about the solidity of this kind of
> seems to be borne out by the history of Eliezer's
> thinking so far -- each year he argues quite
> vehemently and convincingly for perspective; then, a
> year later, he's on to a different perspective.... I
> don't think he's wrong to change his views as he
> learns and grows, but I do think he's wrong to think
> any kind of near-definite Friendly AI is going to be
> arrived at without experimentation with serious
> AGI's... Until then, opinions will shift and grow and
> retreat, as in any data-poor area of inquiry.
-- Ben Goertzel
Agreed. It looks like Eliezer has consumed a huge
amount of science books and papers, and then had a big
rush of blood to the head, fooling himself into
thinking that excellent general knowledge about a
topic can substitute for deep understanding, when in
fact all that is present is a half-arsed 'pop
understanding' of many disconnected bits and pieces.
We all do it, of course. We're all prone to
overestimate our own reasoning abilities. I know I
sometimes read a good popular science book, have a
rush of blood to the head and feel *wow*, I really
understand deep secrets about reality. And then later
I realize the understanding is largely illusory. In
fact that's the mark of a good science book - it
makes you feel like you too are a genius. But of
course that's just a feeling, not reality. The
insights are coming from the author, not the reader.
"Live Free or Die, Death is not the Worst of Evils."
- Gen. John Stark
"The Universe...or nothing!"
Please visit my web-sites.
Sci-Fi and Fantasy : http://www.prometheuscrack.com
Mathematics, Mind and Matter : http://www.riemannai.org
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:49 MDT