From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed Aug 17 2005 - 07:03:57 MDT
> A paper-clip maximizer would be blocked by Gödel,
> since it couldn't have full self-awareness and hence
> couldn't recursively self-improve.
Gödelian arguments can show, at most, that for any finite computational
system there is some limit (of theorem-proving power) beyond which
self-improvement cannot take the system.
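This limit claim can be made precise with a standard incompleteness sketch (my framing, not spelled out in the original post): model each self-improvement step as adding a consistency statement to the system's theory, and note that any computable chain of such steps has a computable union, which is again subject to Gödel's second incompleteness theorem.

```latex
% Illustrative sketch: self-improvement as iterated consistency extension.
% Assume T_0 is a consistent, recursively axiomatized theory extending PA.
\begin{align*}
  T_{n+1} &= T_n + \mathrm{Con}(T_n)
    && \text{a strict improvement, since } T_n \nvdash \mathrm{Con}(T_n)
       \text{ (G\"odel II)} \\
  T_\omega &= \bigcup_{n} T_n
    && \text{the union of a computable chain is still recursively axiomatized} \\
  T_\omega &\nvdash \mathrm{Con}(T_\omega)
    && \text{so the improved system faces its own G\"odelian limit}
\end{align*}
```

The sketch shows why the limit exists for any finite (computable) self-improver, while saying nothing about how high that limit sits relative to human comprehension.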
However, this limit can lie vastly beyond anything comprehensible by
humans, and it certainly allows more than enough power to cause our
destruction, much as we might step on an ant colony...
So, Gödelian arguments are obviously of no use for FAI...
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT