'''Information theory''' is a discipline in [[applied mathematics]] involving the quantification of data with the goal of enabling as much data as possible to be reliably stored on a medium or communicated over a channel. The measure of information, known as [[information entropy]], is usually expressed by the average number of bits needed for storage or communication. For example, if a daily weather description has an entropy of 3 bits, then, over enough days, we can describe daily weather with an ''average'' of approximately 3 bits per day.
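As a concrete sketch of the weather example (the probability distributions below are hypothetical, chosen only for illustration), the entropy of a discrete distribution is H = −Σ p·log₂ p, which gives the average number of bits per message:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical model: 8 equally likely daily weather descriptions
# needs log2(8) = 3 bits per day on average.
uniform_weather = [1 / 8] * 8
print(entropy(uniform_weather))  # → 3.0

# A skewed distribution is more predictable, so it carries
# fewer bits of information per day on average.
skewed_weather = [0.5, 0.25, 0.125, 0.125]
print(entropy(skewed_weather))  # → 1.75
```

Note that entropy depends only on the probabilities of the outcomes, not on what the outcomes are: the less predictable the weather, the more bits are needed on average to describe it.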