Re: Signaling after a singularity

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Wed May 07 2008 - 03:53:05 MDT


> I would have to disagree. Complete understanding does not necessarily
> imply drastic change in behavior. Humans have always overridden
> understanding and logic with emotional response. Why would my perfect
> understanding of you mean there was no longer a perceived need for
> competition, or creative expression?

Competition would become stunted, because it would be obvious who
would "win". People might still create (for themselves), but
conversations might go along the lines of:
A: I will write a better poem than any of yours!
B: No you won't. The AI claims, taking humanity's aesthetic values,
that I'm a better poet.
A: Yes, but if I practise long enough...
B: According to the AI, you will be better than me if you practise
7 hours a day for 5 weeks. However, if I practise 5 hours a week, I
will maintain my edge indefinitely. The AI claims I'm lazier than you,
but thinks I could stick to that program as well as you can stick to
yours.
A: So no real point in writing a poem, then?
B: Or practising.

It might not end up like that, but without the possibility of real
competition (because you achieve exactly what everyone knows you will
achieve), the world would be very different.

Of course, if things end up being probabilistic rather than
deterministic, then it may be very different; people still take dumb
risks, after all.

Stuart



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT