Kolmogorov complexity

From http://en.wikipedia.org/wiki/Kolmogorov_complexity

In algorithmic information theory (a subfield of computer science), the Kolmogorov complexity of an object, such as a piece of text, is a measure of the computational resources needed to specify the object. It is named after Andrey Kolmogorov, who first published on the subject in 1963.[1][2]

Kolmogorov complexity is also known as "descriptive complexity" (not to be confused with descriptive complexity theory), Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity.
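Kolmogorov complexity itself is uncomputable, but the idea that a highly regular object admits a much shorter description than an irregular one of the same length can be illustrated with an ordinary compressor. The sketch below (my own illustration, not from the article) uses the zlib-compressed size as a crude, computable upper bound on description length:

```python
import random
import zlib

def description_length(s: bytes) -> int:
    # Compressed size in bytes: a computable upper bound on the
    # (uncomputable) Kolmogorov complexity of s, up to a constant.
    return len(zlib.compress(s, 9))

# A highly regular string: a tiny program ("print 'ab' 500 times") describes it.
regular = b"ab" * 500

# A pseudo-random string of the same length: little exploitable structure.
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

print(len(regular), description_length(regular))  # compresses to a few bytes
print(len(noisy), description_length(noisy))      # stays close to 1000 bytes
```

The gap between the two compressed sizes mirrors the gap in descriptive complexity: the regular string is specified by a short rule, while the noisy string essentially has to be quoted verbatim.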

Relation to entropy

It can be shown[11] that for the output of Markov information sources, Kolmogorov complexity is related to the entropy of the information source. More precisely, the Kolmogorov complexity of the output of a Markov information source, normalized by the length of the output, converges almost surely (as the length of the output goes to infinity) to the entropy of the source.
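This convergence can be made tangible with a compressor standing in for Kolmogorov complexity. The sketch below (an illustration under my own assumptions, not from the article) draws output from an i.i.d. Bernoulli(p) source, the simplest degenerate case of a Markov source, and compares the zlib-compressed size per symbol with the Shannon entropy H(p). The compressed rate is only an upper-bound proxy, so it sits somewhat above H(p) rather than matching it exactly:

```python
import math
import random
import zlib

def binary_entropy(p: float) -> float:
    # Shannon entropy, in bits per symbol, of a Bernoulli(p) source.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def compressed_rate(p: float, n: int = 100_000, seed: int = 0) -> float:
    # Compressed size in bits per symbol of an n-symbol Bernoulli(p)
    # sample: a crude upper-bound stand-in for K(output) / n.
    rng = random.Random(seed)
    sample = bytes(1 if rng.random() < p else 0 for _ in range(n))
    return 8 * len(zlib.compress(sample, 9)) / n

for p in (0.1, 0.5):
    print(f"p={p}: H(p)={binary_entropy(p):.3f}, "
          f"compressed rate={compressed_rate(p):.3f} bits/symbol")
```

For p = 0.5 the source has one bit of entropy per symbol and the sample is essentially incompressible; for a biased source such as p = 0.1 the rate drops well below one bit per symbol, tracking the lower entropy, though a general-purpose compressor does not reach H(p) itself.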