A complementary way of measuring information is provided by [[algorithmic information theory]]. In brief, this measures the information content of a list of symbols based on how predictable they are, or more specifically, how easy it is to [[compute]] the list through a [[program]]: the information content of a sequence is the number of bits of the shortest program that computes it. The sequence below would have a very low algorithmic information measurement, since it follows a very predictable pattern, and as the pattern continues the measurement would not change. Shannon information, by contrast, assigns each symbol the same information measurement, treating the symbols as [[statistical randomness|statistically random]], so each new symbol increases the total measurement even though the pattern adds nothing new.
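The contrast can be illustrated with a rough experiment. The shortest-program length (Kolmogorov complexity) is uncomputable in general, but the output size of a general-purpose compressor gives an upper bound on it. The sketch below (an illustration, not part of the article) uses Python's standard `zlib` to compare a predictable sequence with a statistically random one built from the same two symbols:

```python
import zlib
import random

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed form of s, used here as a
    rough upper bound on its algorithmic information content."""
    return len(zlib.compress(s.encode("ascii"), 9))

# A highly predictable sequence: the compressed size barely grows as the
# pattern continues, mirroring the near-constant algorithmic measure.
for n in (100, 1000, 10000):
    print("pattern", 2 * n, compressed_size("AB" * n))

# A statistically random sequence over the same symbols: the compressed
# size keeps growing with length, as the per-symbol Shannon measure
# predicts (each new symbol adds information).
random.seed(0)
for n in (100, 1000, 10000):
    s = "".join(random.choice("AB") for _ in range(2 * n))
    print("random ", 2 * n, compressed_size(s))
```

For the repetitive string the compressed size stays tiny regardless of length, while for the random string it grows roughly in proportion to the number of symbols, which is the distinction the paragraph describes.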