Information theory

Information theory studies the quantification, storage, and communication of information.
Related Articles

Channel capacity

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent only on the statistics of the channel over which the messages are sent.
Channel capacity, in electrical engineering, computer science and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.
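
As a small illustration added here (not part of the original article), the capacity of a binary symmetric channel with crossover probability p works out to C = 1 − H2(p), where H2 is the binary entropy function; a minimal Python sketch, assuming base-2 logarithms so the unit is bits per channel use:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits; H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H2(p)."""
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; a completely random one carries 0.
print(bsc_capacity(0.0))   # 1.0
print(bsc_capacity(0.11))  # ~0.5
print(bsc_capacity(0.5))   # 0.0
```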

Lossy compression

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL).
Basic information theory says that there is an absolute limit to how much the size of the data can be reduced.
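
To make that limit concrete (an illustration added here, not from the article), rate–distortion theory quantifies it for lossy compression: for a Bernoulli(p) source under Hamming distortion, the rate–distortion function is R(D) = H2(p) − H2(D) for D up to min(p, 1−p), and zero beyond. A small sketch:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def h2(p: float) -> float:
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p: float, D: float) -> float:
    """R(D) for a Bernoulli(p) source with Hamming distortion: the fewest
    bits per symbol needed when an average distortion D is tolerated."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

# A fair-coin source needs 1 bit/symbol losslessly, but only ~0.53 bits/symbol
# if 10% of reconstructed symbols may be wrong.
print(rate_distortion_bernoulli(0.5, 0.0))  # 1.0
print(rate_distortion_bernoulli(0.5, 0.1))  # ~0.531
```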

Mutual information

Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
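
For concreteness (a sketch added here, not from the article), the mutual information of two discrete variables can be computed directly from their joint distribution as I(X;Y) = Σ p(x,y) log2[ p(x,y) / (p(x) p(y)) ]:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as a 2-D list of p(x, y)."""
    px = [sum(row) for row in joint]        # marginal p(x)
    py = [sum(col) for col in zip(*joint)]  # marginal p(y)
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

# Independent variables share no information; a noiseless copy of a fair bit
# shares exactly 1 bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```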

Error exponent

Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
In information theory, the error exponent of a channel code or source code is the rate at which the error probability decays exponentially with the block length of the code: if the error probability for block length n behaves like e^(−n·E), then E is the error exponent.

Information

Information theory studies the quantification, storage, and communication of information.
In information theory, information is taken as an ordered sequence of symbols from an alphabet, say an input alphabet χ, and an output alphabet ϒ.

A Mathematical Theory of Communication

It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication". The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
The article was the founding work of the field of information theory.

Information engineering (field)

The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering.
The components of information engineering include more theoretical fields such as machine learning, artificial intelligence, control theory, signal processing, and information theory, and more applied fields such as computer vision, natural language processing, bioinformatics, medical image computing, cheminformatics, autonomous robotics, mobile robotics, and telecommunications.

Algorithmic information theory

Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information.

Noisy-channel coding theorem

Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent only on the statistics of the channel over which the messages are sent.
In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
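
For example (an illustration added here, not from the article), a binary symmetric channel that flips each transmitted bit with probability 0.11 has a capacity of roughly 0.5 bits per channel use, so sufficiently long codes can deliver data nearly error-free at any rate below about 0.5 bits per use, while no reliable scheme can exceed that rate.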

Forward error correction

Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
In telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.
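
As a toy illustration of channel coding added here (not the article's example), the 3-fold repetition code corrects any single bit flip per block by majority vote, at the cost of sending three channel bits per data bit:

```python
# Illustrative sketch added for this entry; helper names are our own.

def encode_repetition(bits, n=3):
    """Encode each data bit by repeating it n times (rate 1/n)."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Decode by majority vote over each block of n received bits."""
    out = []
    for i in range(0, len(coded), n):
        block = coded[i:i + n]
        out.append(1 if sum(block) > n // 2 else 0)
    return out

message = [1, 0, 1, 1]
codeword = encode_repetition(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
received = codeword[:]
received[1] ^= 1                               # flip one bit in the first block
received[7] ^= 1                               # and one bit in the third block
assert decode_repetition(received) == message  # both single-bit errors corrected
```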

Complex system

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Complex systems is therefore often used as a broad term encompassing a research approach to problems in many diverse disciplines, including statistical physics, information theory, nonlinear dynamics, anthropology, computer science, meteorology, sociology, economics, psychology, and biology.

Artificial intelligence

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Along with concurrent discoveries in neurobiology, information theory and cybernetics, this led researchers to consider the possibility of building an electronic brain.

Bioinformatics

The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.
The algorithms in turn depend on theoretical foundations such as discrete mathematics, control theory, system theory, information theory, and statistics.

Bell Labs

Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability.
Researchers working at Bell Labs are credited with the development of radio astronomy, the transistor, the laser, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages C, C++, and S.

Systems science

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Systems science covers formal sciences such as complex systems, cybernetics, dynamical systems theory, information theory, linguistics or systems theory.

Cryptography

The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.
This fundamental principle was first explicitly stated in 1883 by Auguste Kerckhoffs and is generally called Kerckhoffs's Principle; alternatively and more bluntly, it was restated by Claude Shannon, the inventor of information theory and the fundamentals of theoretical cryptography, as Shannon's Maxim—'the enemy knows the system'.

Ralph Hartley

Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log Sⁿ = n log S, where S is the number of possible symbols and n the number of symbols in a transmission.
He invented the Hartley oscillator and the Hartley transform, and contributed to the foundations of information theory.
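
For example (a small numeric sketch added here, choosing base-2 logarithms so the unit is the bit), Hartley's measure assigns n·log S units of information to n symbols drawn from an alphabet of S equally likely symbols:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's measure H = n * log2(S), in bits, for n symbols drawn from
    an alphabet of S equally likely possibilities."""
    return n_symbols * math.log2(alphabet_size)

# Four symbols from a 26-letter alphabet: about 18.8 bits.
print(hartley_information(4, 26))
```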

Data compression

It was originally proposed by Claude E. Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper entitled "A Mathematical Theory of Communication". Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
The theoretical background of compression is provided by information theory (which is closely related to algorithmic information theory) for lossless compression and rate–distortion theory for lossy compression.
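
As a rough demonstration added here (not from the article), the zeroth-order empirical entropy of a byte string bounds what a symbol-by-symbol lossless code can achieve, while a general-purpose compressor such as zlib can do better on repetitive data by exploiting longer-range structure; the numbers printed depend on the chosen sample text:

```python
import math
import zlib
from collections import Counter

# Illustrative sketch added for this entry; helper names are our own.

def empirical_entropy_bits(data: bytes) -> float:
    """Zeroth-order empirical entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = b"abracadabra " * 200
h = empirical_entropy_bits(text)        # bits per byte under a symbol-wise model
compressed = zlib.compress(text, 9)
rate = 8 * len(compressed) / len(text)  # achieved bits per byte

print(f"empirical entropy: {h:.2f} bits/byte")
print(f"zlib rate:         {rate:.2f} bits/byte")
```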

Redundancy (information theory)

the information entropy and redundancy of a source, and its relevance through the source coding theorem;
In information theory, redundancy measures the fractional difference between the entropy of an ensemble and its maximum possible value.
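
A minimal sketch added here, taking the maximum possible entropy to be log2 of the alphabet size and redundancy to be the fractional shortfall 1 − H/Hmax:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fractional redundancy: 1 - H(X) / log2(alphabet size)."""
    return 1.0 - entropy_bits(probs) / math.log2(len(probs))

print(redundancy([0.25, 0.25, 0.25, 0.25]))  # 0.0 (uniform source, no redundancy)
print(redundancy([0.7, 0.1, 0.1, 0.1]))      # ~0.32
```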

Additive white Gaussian noise

the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel;
Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.
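
The Shannon–Hartley law referred to above gives the capacity of a band-limited AWGN channel as C = B·log2(1 + S/N); a small sketch added here for illustration:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """AWGN channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A 3 kHz telephone-style channel at 30 dB SNR (S/N = 1000):
print(shannon_hartley_capacity(3000, 1000))  # ~29,900 bits/s
```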

Shannon's source coding theorem

the information entropy and redundancy of a source, and its relevance through the source coding theorem;
In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits to possible data compression, and the operational meaning of the Shannon entropy.
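
As an illustration added here, the theorem's limit is the source entropy: no uniquely decodable code can beat it on average, and for dyadic probabilities a simple prefix code meets it exactly:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def average_length(probs, code_lengths):
    """Expected codeword length of a given code, in bits per symbol."""
    return sum(p * l for p, l in zip(probs, code_lengths))

# Source with P(a)=0.5, P(b)=0.25, P(c)=0.25 and prefix code a->0, b->10, c->11.
probs = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]

print(entropy_bits(probs))             # 1.5 bits/symbol (the compression limit)
print(average_length(probs, lengths))  # 1.5 bits/symbol, meeting the limit
```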

Bell System Technical Journal

The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
Claude Shannon's paper "A Mathematical Theory of Communication", which founded the field of information theory, was published as a two-part article in the July and October 1948 issues of the journal.

Machine learning

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Due to its generality, the field is studied in many other disciplines, such as game theory, control theory, operations research, information theory, simulation-based optimization, multi-agent systems, swarm intelligence, statistics and genetic algorithms.

Bit

the bit—a new way of seeing the most fundamental unit of information.
In information theory, one bit is typically defined as the information entropy of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
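
A small numeric check added here: the entropy of a binary variable is exactly one bit only when both outcomes are equally likely, and falls toward zero as the outcome becomes predictable:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.

def binary_entropy(p: float) -> float:
    """Entropy, in bits, of a binary random variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0  -> one full bit of information
print(binary_entropy(0.9))  # ~0.47 bits: a predictable variable is less informative
print(binary_entropy(1.0))  # 0.0  -> a certain outcome carries no information
```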

Entropy in thermodynamics and information theory

Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.
The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form H = −Σᵢ pᵢ log pᵢ, where pᵢ is the probability of the i-th possible message or symbol and the base of the logarithm fixes the unit (base 2 gives bits).
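
As a hedged numerical aside on the thermodynamic side of this connection, Landauer's bound puts the minimum energy dissipated when one bit of information is erased at k_B·T·ln 2, which at room temperature is only of the order of 10⁻²¹ joules:

```python
import math

# Illustrative sketch added for this entry; helper names are our own.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Minimum energy dissipated to erase one bit: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

print(landauer_limit_joules(300.0))  # ~2.87e-21 J per erased bit near room temperature
```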