Re: Hacking your own motivational and emotional systems, how dangerous?

From: Gwern Branwen (gwern0@gmail.com)
Date: Fri Oct 26 2007 - 16:08:02 MDT


On 2007.10.25 15:49:34 +0200, Robin Brandt <mandelum@gmail.com> scribbled 0 lines:
> What are your opinions on the issue of changing your own goal system in
> your brain or in your potential post-singularitarian supermind.
>
> How dangerous would it be if anyone had the right to change their
> motivational system however they wished?
>
> Should it be regulated?
>
> I look very much forward to the possibility of replacing my Darwinian
> drives with a more beautiful, consistent, constructive and moral goal
> system!
>
> Of course you can already do this to a certain level with your own will,
> reflectivity and discipline. But you can't reach into your own supergoal
> space, of course, since that would not have been a good adaptation.
>
> This relates to AI reflectivity and the friendliness stability issue. But
> here the question is about multiple minds that already have a human goal
> system to begin with.
>
> This has probably been discussed a thousand times, but I have not come
> across it yet, so I thought a post may be appropriate.
> Any pointers to articles, blog posts or earlier discussions are welcome!
>
> --
> ~Robin Brandt~

It'd be quite dangerous - if only for the truism that such power to reshape yourself could be used equally for good or evil. Suppose you rewired yourself to be insensible to physical discomfort and pleasure, and to need no sleep: you'd accomplish far more than we normal folks do, but that says nothing about whether your goal is to save the world by creating FAI or to blow up some airplanes. And any mistake in the rewiring would likely be your last.

Someone recently linked here on SL4 to an online novel set post-Singularity, whose main character was the first upload; it mentioned that the upload began tinkering with itself, but was careful always to test modifications on duplicates first - otherwise it might, by accident or out of curiosity, have 'turned on' its pleasure center, immediately causing it to no longer care about or do anything at all: a living death.

--
gwern
SACS IW 5.53 Sayeret NRO Tower cybercash delay Rome bet

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT