From: Stuart Armstrong (firstname.lastname@example.org)
Date: Sat Apr 05 2008 - 04:03:54 MDT
> Correct, many people believe that AGI is impossible within the next 50 years
I'm virtually one of those - I think it is very unlikely that we will
have AGI in the next 50 years (and have always thought so), because I
feel we don't have a theory of consciousness (most versions I know of
are either mathematical models that do not capture our current
intuitive ideas, or boil down to "we'll know it when we see it").
However, I have been convinced that the consequences of AGI are so
monumental that it is worth looking into with great seriousness,
despite the low probability. I was a "low-hanging fruit".
I don't know if this is typical....
> or believe that because it's not certain to happen, it
> shouldn't be planned for. My belief is that people who think this way
> are not "low-hanging fruit", most of them will always find an excuse
> to personally ignore the problem, and so casting the net wider (that
> is, to people who have never heard the talking points) should be a
> higher priority than casting the net deeper (debating people at length
> who have already demonstrated an initial inability or an unwillingness
> to confront the problem.) All IMHO of course, if the more veteran FAI
> advocates tell me I'm wrong, based on their greater advocacy
> experience, then I'll believe them.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT