**From:** Peter de Blanc (*peter@spaceandgames.com*)

**Date:** Tue Jan 06 2009 - 19:39:46 MST

**Next message:** Rolf Nelson: "Re: [sl4] Rolf's gambit revisited"
**Previous message:** Benja Fallenstein: "Re: [sl4] Rolf's gambit revisited"
**In reply to:** Benja Fallenstein: "Re: [sl4] Rolf's gambit revisited"
**Next in thread:** Benja Fallenstein: "Re: [sl4] Rolf's gambit revisited"
**Reply:** Benja Fallenstein: "Re: [sl4] Rolf's gambit revisited"

Benja Fallenstein wrote:

> Question: Can a Turing machine X, based on any criteria at all, form
> a computable probability distribution that assigns probability >1/n
> to a specific hypothesis Y such that K(Y) > K(X) + K(n)? The answer
> isn't clear to me, although my intuition is that it is "no."

That's an interesting question! The answer is yes. For example, take n =
3^^^^3 and let Y be any fairly (but not ludicrously) complex hypothesis:
K(n) is tiny for such an n, while 1/n is so astronomically small that
almost any distribution assigns Y a probability above it.
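A minimal illustration of why this works: a number written in Knuth's up-arrow notation has a very short description (small K(n)) yet an enormous value, so 1/n is a trivially low bar to clear. The sketch below (function name mine, and it computes only the small case 3↑↑3, since 3^^^^3 is far too large to evaluate) shows a few-line program denoting a number already in the trillions:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow notation a ↑^n b, defined recursively."""
    if n == 1:
        return a ** b          # one arrow is ordinary exponentiation
    if b == 1:
        return a               # base case of the recursion
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3↑↑3 = 3^(3^3) = 3^27, already ~7.6 trillion from a tiny program.
# 3↑↑↑↑3 (= 3^^^^3) is unimaginably larger, yet K(n) stays small.
print(up_arrow(3, 2, 3))  # 7625597484987
```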

To get an answer of "no," I think you should ask: can a Turing machine X
form a computable probability distribution that assigns probability >p
to a specific hypothesis Y such that K(Y) > K(X) - log(p) + C? Then if
you choose the right constant C, the answer is "no."

This is because, using entropy coding, we can specify Y with -log(p)
bits given X. We need K(X) bits to specify X, and C bits to specify an
entropy decoder. So K(Y) <= K(X) - log(p) + C.
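The entropy-coding step can be sketched concretely. Below is a toy computable distribution (the hypothesis names and probabilities are invented for illustration, standing in for the distribution output by X); Kraft's inequality guarantees a prefix code exists whose codeword lengths are ceil(-log2 p), so any hypothesis with probability p is specifiable, given the distribution, in at most -log2(p) + 1 bits:

```python
import math

# Hypothetical toy distribution over hypotheses, standing in for
# the computable distribution produced by machine X.
dist = {"Y0": 0.5, "Y1": 0.25, "Y2": 0.125, "Y3": 0.125}

# Shannon code: assign each hypothesis a codeword of length ceil(-log2 p).
lengths = {h: math.ceil(-math.log2(p)) for h, p in dist.items()}

# Kraft's inequality holds, so a prefix code with these lengths exists.
assert sum(2 ** -l for l in lengths.values()) <= 1

# Each hypothesis with probability p gets a codeword of at most
# -log2(p) + 1 bits; the +1 is absorbed into the constant C.
for h, p in dist.items():
    assert lengths[h] <= -math.log2(p) + 1
```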


*This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT*