[[Image:lighterstill.jpg]] [[Image:Teslatower_2.jpg|right|frame]]

'''Information theory''' is a discipline in [[applied mathematics]] involving the quantification of data with the goal of enabling as much data as possible to be reliably stored on a medium or communicated over a channel.  The measure of information, known as [[information entropy]], is usually expressed by the average number of bits needed for storage or communication.  For example, if a daily weather description has an entropy of 3 bits, then, over enough days, we can describe daily weather with an ''average'' of approximately 3 bits per day.
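
In symbols, the entropy of a discrete distribution is <math>H = -\sum_i p_i \log_2 p_i</math> bits. The following minimal sketch reproduces the 3-bit weather figure, assuming (purely for illustration) eight equally likely daily weather descriptions:

<syntaxhighlight lang="python">
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Illustrative assumption: eight equally likely daily weather
# descriptions (e.g. combinations of sky, wind, and precipitation).
weather = [1 / 8] * 8
print(entropy(weather))  # 3.0 -> about 3 bits per day on average
</syntaxhighlight>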
 
Applications of fundamental topics of information theory include [[lossless data compression]] (e.g. [[ZIP (file format)|ZIP files]]), [[lossy data compression]] (e.g. [[MP3]]s), and [[channel capacity|channel coding]] (e.g. for [[DSL]] lines).  The field is at the crossroads of [[mathematics]], [[statistics]], [[computer science]], [[physics]], [[neurobiology]], and [[electrical engineering]]. Its impact has been crucial to the success of the [[Voyager program|Voyager]] missions to deep space, the invention of the [[Compact disc|CD]], the feasibility of mobile phones, the development of the [[Internet]], the study of [[linguistics]] and of human perception, the understanding of [[black hole]]s, and numerous other fields.
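
The lossless case is easy to demonstrate: compressing and then decompressing must return the original data exactly. A minimal sketch using Python's standard zlib module (the sample data is an arbitrary assumption):

<syntaxhighlight lang="python">
import zlib

# Lossless round-trip: the decompressed output equals the input exactly.
data = b"a daily weather description " * 200  # highly repetitive sample
compressed = zlib.compress(data)
assert zlib.decompress(compressed) == data
print(len(data), "->", len(compressed), "bytes")
</syntaxhighlight>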
 
== Overview ==
 
The main concepts of information theory can be grasped by considering the most widespread means of human communication: language.  Two important aspects of a good language are as follows:  First, the most common words (e.g., "a," "the," "I") should be shorter than less common words (e.g., "benefit," "generation," "mediocre"), so that sentences will not be too long.  Such a tradeoff in word length is analogous to [[data compression]] and is the essential aspect of [[source coding]].  Second, if part of a sentence is unheard or misheard due to noise—e.g., a passing car—the listener should still be able to glean the meaning of the underlying message.  Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by [[Channel capacity|channel coding]].  Source coding and channel coding are the fundamental concerns of information theory.
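
Both concerns can be sketched in a few lines. For source coding, a Huffman code (a standard variable-length coding technique; the toy word frequencies below are illustrative assumptions, not measured data) assigns shorter codewords to more common words:

<syntaxhighlight lang="python">
import heapq

def huffman_code(freqs):
    """Build a prefix code in which frequent symbols get short codewords."""
    # Heap entries: (total weight, unique tiebreaker, {symbol: codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Merge the two lightest subtrees, prefixing 0 and 1.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

# Illustrative assumption: toy frequencies for the words quoted above.
freqs = {"the": 50, "a": 40, "I": 30,
         "benefit": 5, "generation": 3, "mediocre": 2}
for word, code in sorted(huffman_code(freqs).items(), key=lambda kv: len(kv[1])):
    print(f"{word:>10}  {code}")
</syntaxhighlight>

For channel coding, the simplest illustration (again a sketch, not one of the codes used in practice) is a repetition code: transmit each bit three times and let the receiver take a majority vote, so a single flipped bit per block is corrected:

<syntaxhighlight lang="python">
def encode_repetition(bits, n=3):
    """Channel coding by repetition: transmit each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(received, n=3):
    """Decode by majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)
sent[4] ^= 1                               # noise flips one transmitted bit
print(decode_repetition(sent) == message)  # True: the error is corrected
</syntaxhighlight>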
 
== Historical background ==
 
{{main|History of information theory}}
      
The landmark event that established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of [[Claude E. Shannon]]'s classic paper "[[A Mathematical Theory of Communication]]" in the ''[[Bell System Technical Journal]]'' in July and October of 1948.
 
