Category:Information Theory

From Nordan Symposia



Information theory is a discipline in applied mathematics concerned with quantifying information, with the goal of storing as much data as possible on a medium, or communicating it over a channel, as reliably as possible. The measure of information, known as information entropy, is usually expressed as the average number of bits needed for storage or communication. For example, if a daily weather description has an entropy of 3 bits, then, over enough days, we can describe the daily weather with an average of approximately 3 bits per day.
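The entropy figure in the weather example can be sketched directly from Shannon's formula H = −Σ p·log₂p. The sketch below uses a hypothetical distribution of eight equally likely daily weather descriptions, which yields exactly the 3 bits mentioned above:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: 8 equally likely weather descriptions.
uniform_weather = [1 / 8] * 8
print(entropy_bits(uniform_weather))  # → 3.0

# A certain outcome carries no information.
print(entropy_bits([1.0]))  # → 0.0
```

Skewed distributions have lower entropy, which is what lossless compressors exploit: frequent outcomes get short codes.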

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s), and channel coding (e.g. for DSL lines). The field sits at the crossroads of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the CD, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.

Pages in category "Information Theory"

The following 2 pages are in this category, out of 2 total.