From: Phillip Huggan (email@example.com)
Date: Sun Apr 23 2006 - 22:53:27 MDT
It's not fair to call AGI the only Singularity. In the end, AGI is just another engineering
technology; you still have to specify some goal for it to serve. It just so happens that a good chunk of our global industrial base is devoted to both improving computer hardware and programming bigger software, so this powerful engineering technology may appear "ahead of its time". But even post-AGI there should still be progress. A time machine would give you direct access to the machine's whole future light-cone volume instantly. The ability to manufacture black holes, travel near c, or even harvest antimatter may be a bigger accelerant post-AGI than AGI would be now. In this context a technology like MNT, or indeed any exponential manufacturing method, can be seen as Singularity-ish.
AGI spacecraft might not blow apart, but their time machines or their particle accelerators might. This is why I think humans should have the ultimate say on any AGI action. If ve is just allowed to force vis engineering concerns, ve will eventually come across problems that may be as intractable to it (from the PoV of us meaty, fragile humans who would suffer extinction from any AGI engineering gone wrong) as our societal problems are to us. With AGI you are attempting to take on all human extinction risks in one giant bite. I'm not saying don't do it, I'm just saying don't hold achieving AGI to be the be-all and end-all (even though it might be), especially if a superior pathway presents itself in the decades ahead. Don't turn on SkyNet if we've got a good thing going in the decades ahead.
Richard Loosemore <firstname.lastname@example.org> wrote:
This is an important thread.
I think that Kurzweil's tendency to emphasize exponential curves of
general technology is really a nuisance. I don't care how many curves
he finds, I think that without AGI we simply cannot get more than a
certain rate of new technological development. Frankly, I think it will
hit a plateau. (I even have my suspicions that we already did hit the plateau.)
And as for AGI, this for me is *the* definition of the Singularity.
None of the rest is very important. Without it, I see so many
limitations on the ability of human engineers to handle the complexity
of ordinary technology that I do not believe there will be any dramatic
advances in the near future. I think spacecraft will continue to blow
up because of single-line errors in their code, and the IRS will fail
again and again to upgrade their computer software.
This archive was generated by hypermail 2.1.5 : Sat May 25 2013 - 04:01:00 MDT