From: turin (firstname.lastname@example.org)
Date: Thu Dec 22 2005 - 16:25:19 MST
I am curious to know what people think about the intersection between Nikolai Kardashev's scale, from his article "On the Inevitability and the Possible Structures of Supercivilizations," and the Singularity, in terms of computational power and the complexity of our algorithms. I wonder if any definitive, or at least in-depth, work has been done on how these two things affect each other.
I bring this up in the context of existential risk. In Kurzweil's The Age of Spiritual Machines, in the dialogue with the superintelligence from 2099, the superintelligence is very concerned with "viruses" and other mindware diseases on the net, but not so much with physical diseases of the hardware, and "she" talks about a mysterious time when 90% of the processing power disappeared for no reason that could be detected inside the net.
I bring this up because, if intelligent existence evolves toward the virtual, one would not necessarily need to control the energy of an entire galaxy, as in the Kardashev scenario. If interstellar travel is impossible, one could simply make more complex virtual worlds, provided femtotechnology is possible, as processors would just be strings of mesons or whatnot, quantum ghosts basically.
I am a little dubious about string theory, but if the electron turns out to be, in fact, a black hole, or some other strange possibility holds, then perhaps, to escape the death of the universe (provided there are other ones which one could get to), one would not need the energy of a galactic cluster or supercluster, only very good femtotechnology and the energy of a single star.
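To put rough numbers on the single-star-versus-galaxy gap: using Carl Sagan's continuous interpolation of the Kardashev scale (an extension not in Kardashev's original paper), one star's output sits near Type II while a whole galaxy sits past Type III, about eleven orders of magnitude apart. A quick sketch, with approximate luminosity figures I am assuming for the Sun and the Milky Way:

```python
import math

def kardashev_type(power_watts):
    """Sagan's interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, with P the harnessed power in watts."""
    return (math.log10(power_watts) - 6) / 10

SUN_LUMINOSITY = 3.8e26        # watts, roughly one star's total output
MILKY_WAY_LUMINOSITY = 4e37    # watts, rough order of magnitude for a galaxy

print(f"One star:   K = {kardashev_type(SUN_LUMINOSITY):.2f}")
print(f"One galaxy: K = {kardashev_type(MILKY_WAY_LUMINOSITY):.2f}")
```

So if femtotech computation really only needs a star, the civilization in question could stay near Type II forever and never show up as the galaxy-scale engineering the original scale anticipates.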
In other words, one could literally become so small and so smart, one could just disappear through the cracks in the floor like a cockroach.
It's just an idea. I have no clue how plausible it would be, but it does make one wonder how dated the Kardashev scale is.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT