Once again: James Gleick describes the “Kolmogorov complexity” of an object as “the size, in bits, of the shortest algorithm needed to generate it”. In other words, it is a measure of how much information the object holds.
Gleick outlines, in quite a bit of detail, that the amount of unpredictability (surprise) in something is equal to the amount of information it holds, be it a number, a sentence, or a thing. The number 1,000,000,000,000,000,000 holds little information even though it is very large, since the zeroes are predictable: a very short program (“print a 1 followed by eighteen zeros”) generates it. The number 14,957, by contrast, has no discernible pattern and therefore holds more information for its size. Similarly, a string of letters like “pile” holds less information than its length suggests, since English is predictable enough that we can guess the “e” at the end.
More interestingly, Gleick explains that Kolmogorov also saw this as a measure of randomness: the more information something contains, the more random it is, and vice versa.
Via: “The Information”, p. 337.