Information theory (nonfiction)

'''Information theory''' studies the quantification, storage, and communication of information.

It was originally proposed by [[Claude Shannon (nonfiction)|Claude Shannon]] in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication".
The theory has since found applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.


The field is at the intersection of [[Mathematics (nonfiction)|mathematics]], statistics, [[Computer science (nonfiction)|computer science]], [[Physics (nonfiction)|physics]], neurobiology, and electrical engineering.
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes).
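
A minimal worked sketch of these two entropy values, assuming the standard Shannon entropy formula with base-2 logarithms (so entropy is measured in bits); the function name <code>shannon_entropy</code> is illustrative rather than part of any standard library:

<syntaxhighlight lang="python">
import math

def shannon_entropy(probabilities):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6), about 2.585 bits.
print(shannon_entropy([1/6] * 6))    # 2.584962500721156
</syntaxhighlight>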


Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
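
As a rough illustration of one of these measures, a minimal sketch of relative entropy (Kullback-Leibler divergence) between two discrete distributions, again assuming base-2 logarithms; the function name <code>relative_entropy</code> and the biased-coin example are illustrative:

<syntaxhighlight lang="python">
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q), in bits, between discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# How distinguishable a biased coin (70/30) is from a fair coin, in bits.
print(relative_entropy([0.7, 0.3], [0.5, 0.5]))   # about 0.119
</syntaxhighlight>
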
== In the News ==
 
<gallery mode="traditional">
</gallery>


== Fiction cross-reference ==

== Nonfiction cross-reference ==

== External links ==