The view of information as a message came into prominence with the publication in 1948 of an influential paper by [[Claude Shannon]], "[[A Mathematical Theory of Communication]]." This paper provides the foundations of [[information theory]] and endows the word ''information'' not only with a technical meaning but also a measure. If the sending device is equally likely to send any one of a set of <math>N</math> messages, then the preferred measure of "the information produced when one message is chosen from the set" is the base-two [[logarithm]] of <math>N</math> (this measure is called ''[[self-information]]''). In this paper, Shannon continues:

"The [[choice]] of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly [[bit]]s, a word suggested by [[John Tukey|J. W. Tukey]]. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information. N such devices can store N bits…" (Shannon, ''The Bell System Technical Journal'', Vol. 27, p. 379, July 1948)
    
A complementary way of measuring information is provided by [[algorithmic information theory]]. In brief, this measures the information content of a list of symbols based on how predictable they are, or more specifically how easy it is to [[compute]] the list through a [[program]]: the information content of a sequence is the number of bits of the shortest program that computes it. A highly regular sequence, such as the alternating pattern sketched below, would have a very low algorithmic information measurement, and as the pattern continues the measurement would not change. Shannon information, by contrast, would give the same measurement for each symbol, since the symbols are [[statistical randomness|statistically random]], and each new symbol would increase the measurement.
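The contrast can be sketched in Python, using zlib's compressed size as a rough stand-in for the length of the shortest program; the alternating pattern and the helper names here are illustrative assumptions, not taken from the article:

<pre>
import math
import random
import zlib

def shannon_bits(seq: str) -> float:
    """Total Shannon information of seq under its empirical symbol frequencies."""
    probs = [seq.count(s) / len(seq) for s in set(seq)]
    return -len(seq) * sum(p * math.log2(p) for p in probs)

def compressed_size(seq: str) -> int:
    """zlib-compressed length in bytes: a crude proxy for algorithmic information content."""
    return len(zlib.compress(seq.encode()))

predictable = "01" * 500  # a very predictable pattern; its shortest program is tiny
random.seed(0)
scrambled = "".join(random.choice("01") for _ in range(1000))  # statistically similar, but patternless

# The Shannon measure sees both as about 1 bit per symbol (roughly 1000 bits)...
print(shannon_bits(predictable), shannon_bits(scrambled))
# ...while compression separates them sharply: the regular pattern shrinks far more.
print(compressed_size(predictable), compressed_size(scrambled))
</pre>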
