Information theory (nonfiction)
<gallery mode="traditional">
File:Maxwell's_demon.svg|link=Maxwell's demon (nonfiction)|[[Maxwell's demon (nonfiction)|Maxwell's demon]] not so bad once you get to know [[Mathematics|the math]], says [[Léon Brillouin (nonfiction)|Brillouin]].
</gallery>
Information theory studies the quantification, storage, and communication of information.
It was originally proposed by Claude Shannon in 1948, in his landmark paper "A Mathematical Theory of Communication", to find fundamental limits on signal processing and communication operations such as data compression.
The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering.
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a fair die (with six equally likely outcomes).
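The coin and die comparison follows directly from Shannon's entropy formula, H(X) = -Σ p(x) log₂ p(x), measured in bits. Below is a minimal Python sketch of that arithmetic; the function name and layout are illustrative, not taken from the original article.

<pre>
from math import log2

def entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty.
print(entropy([1/2, 1/2]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> about 2.585 bits.
print(entropy([1/6] * 6))    # log2(6) ≈ 2.585
</pre>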
In the News
Maxwell's demon not so bad once you get to know the math, says Brillouin.
Fiction cross-reference
Nonfiction cross-reference
External links:
- Information theory @ Wikipedia