http://arxiv.org/abs/1303.6738

http://www.mdpi.com/1099-4300/13/3/595/pdf

“Kolmogorov complexity and Shannon entropy are conceptually different, as the former is based on the length of programs and the latter on probability distributions. However, for any recursive probability distribution (i.e., distributions that are computable by a Turing machine), the expected value of the Kolmogorov complexity equals the Shannon entropy, up to a constant term depending only on the distribution (see [1]).”

I was mixing the two up a bit in this essay, but more or less assuming this equivalence holds. In this case it does, since the pattern in question is certainly computable by a Turing machine.
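One way to get an empirical feel for the quoted result: Kolmogorov complexity itself is uncomputable, but any real compressor gives a computable upper bound on it, so on samples from a computable source the compressor's rate should sit at or somewhat above the Shannon entropy, never below it asymptotically. A rough sketch of that comparison, using zlib as a stand-in for K on a biased coin source (the choice of p, sample size, and zlib are all illustrative assumptions, not from the quoted paper):

```python
import math
import random
import zlib

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) source, in bits per symbol."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

random.seed(0)          # fixed seed so the experiment is repeatable
p = 0.1                 # a computable (indeed very simple) distribution
n = 100_000             # number of symbols to sample

# Draw n biased coin flips, encoded as ASCII '0'/'1' bytes.
data = bytes(49 if random.random() < p else 48 for _ in range(n))

# Compressed length is a computable upper bound on the Kolmogorov
# complexity of this particular string (up to a constant).
compressed = zlib.compress(data, 9)
rate = 8 * len(compressed) / n   # bits per symbol actually used

print(f"H(p)      = {binary_entropy(p):.3f} bits/symbol")
print(f"zlib rate = {rate:.3f} bits/symbol")
```

The compressed rate lands well below the naive 8 bits per symbol but above H(0.1) ≈ 0.469 bits, consistent with the theorem: the expected description length of a computable source can't beat its entropy, and a practical compressor approaches it from above with some overhead.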
