'''Information theory''' is a discipline in [[applied mathematics]] involving the quantification of data with the goal of enabling as much data as possible to be reliably stored on a medium or communicated over a channel. The measure of information, known as [[information entropy]], is usually expressed by the average number of bits needed for storage or communication. For example, if a daily weather description has an entropy of 3 bits, then, over enough days, we can describe daily weather with an ''average'' of approximately 3 bits per day.
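
As a minimal sketch of how such an entropy figure is computed, the short Python example below evaluates the Shannon entropy <math>H = -\sum_i p_i \log_2 p_i</math> for an assumed four-outcome daily-weather distribution (the probabilities are illustrative, not from the text above):
<syntaxhighlight lang="python">
import math

# Hypothetical four-outcome daily-weather distribution; the probabilities
# are assumed purely for illustration.
weather = {"sunny": 0.5, "cloudy": 0.25, "rain": 0.125, "snow": 0.125}

# Shannon entropy in bits: H = -sum over outcomes of p * log2(p)
entropy = -sum(p * math.log2(p) for p in weather.values())

print(f"{entropy:.2f} bits per day")  # 1.75 bits for this distribution
</syntaxhighlight>
A source with this distribution can therefore be described, on average, with about 1.75 bits per day, even though naively labelling four outcomes would take 2 bits each.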

Applications of fundamental topics of information theory include [[lossless data compression]] (e.g. [[ZIP (file format)|ZIP files]]), [[lossy data compression]] (e.g. [[MP3]]s), and [[channel capacity|channel coding]] (e.g. for [[DSL]] lines). The field is at the crossroads of [[mathematics]], [[statistics]], [[computer science]], [[physics]], [[neurobiology]], and [[electrical engineering]]. Its impact has been crucial to the success of the [[Voyager program|Voyager]] missions to deep space, the invention of the [[Compact disc|CD]], the feasibility of mobile phones, the development of the [[Internet]], the study of [[linguistics]] and of human perception, the understanding of [[black hole]]s, and numerous other fields.
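
The link between lossless compression and entropy can be illustrated with a small Python sketch (the biased binary source below is an assumption chosen for illustration): by the source coding theorem, no lossless compressor can on average use fewer bits than the entropy of the source, and a general-purpose compressor such as zlib gets well below the raw size while staying above that bound.
<syntaxhighlight lang="python">
import math
import random
import zlib

random.seed(0)

# A memoryless binary source emitting '1' with probability p, '0' otherwise.
p = 0.1
n = 100_000
data = bytes(int(random.random() < p) for _ in range(n))  # one byte per symbol

# Entropy of the source, in bits per symbol.
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

compressed = zlib.compress(data, 9)

print(f"raw size:             {n} bytes")
print(f"source entropy bound: {n * h / 8:.0f} bytes")
print(f"zlib-compressed size: {len(compressed)} bytes")
# The compressed size falls far below the raw size but, on average,
# cannot fall below roughly n * h bits (the entropy bound).
</syntaxhighlight>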

[[Category: Formal Sciences]]
