Information theory

information-theoretic, information theorist, information, information theoretic, Semiotic information theory, information theoretical, Shannon information theory, Shannon theory, theory of information, classical information theory
Information theory studies the quantification, storage, and communication of information.
Related Articles

Mobile phone

cell phone, mobile phones, mobile
Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.
The development of metal-oxide-semiconductor (MOS) large-scale integration (LSI) technology, information theory and cellular networking led to the development of affordable mobile communications.

Information engineering (field)

information engineering, Information, IE, Information engineering
The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering.
The components of information engineering include more theoretical fields such as machine learning, artificial intelligence, control theory, signal processing, and information theory, and more applied fields such as computer vision, natural language processing, bioinformatics, medical image computing, cheminformatics, autonomous robotics, mobile robotics, and telecommunications.

Algorithmic information theory

Algorithmic complexity, algorithmic information, decorrelation
Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, Grey system theory and measures of information.
Algorithmic information theory (AIT) is a "merger of information theory and computer science" that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated), such as strings or any other data structure.
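The central quantity of AIT is the Kolmogorov (algorithmic) complexity of an object; as a sketch of the standard definition, for a fixed universal machine U it is

    K_U(x) = \min \{\, \ell(p) : U(p) = x \,\},

the length of the shortest program p that makes U output x.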

Forward error correction

FEC, channel coding, error correcting codes
Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, Grey system theory and measures of information.
In telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels.
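As a minimal sketch of the idea (a toy repetition code, not any standardized FEC scheme), each bit can be transmitted several times and decoded by majority vote, so that an isolated bit flip is corrected without retransmission:

    def encode_repetition(bits, n=3):
        """Repeat each bit n times (toy forward error correction)."""
        return [b for b in bits for _ in range(n)]

    def decode_repetition(received, n=3):
        """Decode by majority vote over each block of n received bits."""
        blocks = (received[i:i + n] for i in range(0, len(received), n))
        return [1 if sum(block) > n // 2 else 0 for block in blocks]

    sent = encode_repetition([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    noisy = sent[:]
    noisy[1] ^= 1                         # the channel flips one bit
    assert decode_repetition(noisy) == [1, 0, 1]

Real FEC schemes (Hamming, Reed–Solomon, LDPC, turbo codes) achieve far better trade-offs between redundancy and error-correcting power than this repetition scheme.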

A Mathematical Theory of Communication

The Mathematical Theory of Communication, communication theoretic, Mathematical Theory of Communication
It was originally proposed by Claude Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper titled "A Mathematical Theory of Communication".
The article was the founding work of the field of information theory.

Channel capacity

capacity, data capacity, information capacity
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.
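In Shannon's formulation, the capacity of a discrete memoryless channel is the mutual information between input and output, maximized over all input distributions; for the standard example of a binary symmetric channel with crossover probability p this gives (a sketch in conventional notation):

    C = \max_{p_X} I(X; Y), \qquad
    C_{\mathrm{BSC}} = 1 + p \log_2 p + (1 - p) \log_2 (1 - p).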

Lossy compression

lossy, lossy data compression, compressed
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL).
Basic information theory says that there is an absolute limit on how far the size of such data can be reduced.
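Rate–distortion theory makes this limit precise: given a distortion measure d and a tolerated average distortion D, the smallest achievable rate is (sketched in standard notation)

    R(D) = \min_{p(\hat{x} \mid x)\,:\; \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}).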

Mutual information

Average Mutual Information, information, algorithmic mutual information
Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables.
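For discrete random variables the standard closed form, equivalently expressed through entropies, is

    I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
            = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y),

which is zero exactly when X and Y are independent.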

Information

informative, input, inputs
Information theory studies the quantification, storage, and communication of information.
In information theory, information is taken as an ordered sequence of symbols from an alphabet, say an input alphabet χ, and an output alphabet ϒ.

Telecommunication

telecommunications, communications, telecom
Information theory studies the quantification, storage, and communication of information.
There was a rapid growth of the telecommunications industry towards the end of the 20th century, driven by the development of metal-oxide-semiconductor (MOS) large-scale integration (LSI) technology, information theory, digital signal processing, and wireless communications such as cellular networks and mobile telephony.

Noisy-channel coding theorem

Shannon limit, noisy channel, Shannon's theorem
Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable is equal to the channel capacity, a quantity dependent merely on the statistics of the channel over which the messages are sent.
In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
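Sketched slightly more formally: for every rate R below the capacity C and every ε > 0, there exist block codes of rate R and sufficiently large length n whose error probability is below ε, whereas no sequence of codes with rate above C can have error probability tending to zero:

    R < C:\ \text{codes exist with } P_e \to 0 \text{ as } n \to \infty; \qquad
    R > C:\ P_e \text{ remains bounded away from } 0.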

Bioinformatics

bioinformatic, bioinformatician, bio-informatics
The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.
The algorithms in turn depend on theoretical foundations such as discrete mathematics, control theory, system theory, information theory, and statistics.

Error exponent

Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
In information theory, the error exponent of a channel code or source code is the rate at which its error probability decays exponentially with the block length of the code.
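In the usual notation (a sketch; conventions differ on the logarithm base and on lim versus lim sup), the error exponent of a channel code at rate R is

    E(R) = \lim_{n \to \infty} -\frac{1}{n} \ln P_e^{*}(n, R),

where P_e^{*}(n, R) is the smallest error probability achievable by any length-n code of rate R.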

Data compression

compression, video compression, compressed
Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, Grey system theory and measures of information. It was originally proposed by Claude Shannon in 1948 to find fundamental limits on signal processing and communication operations such as data compression, in a landmark paper titled "A Mathematical Theory of Communication". The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
The theoretical basis for compression is provided by information theory and, more specifically, algorithmic information theory for lossless compression and rate–distortion theory for lossy compression.
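As a minimal sketch of the entropy limit on lossless compression, assuming an i.i.d. byte source (which real files generally are not), the empirical entropy of the byte histogram gives a rough lower bound on the compressed size:

    import math
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        """Shannon entropy of the byte histogram, in bits per byte."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    text = b"abracadabra abracadabra abracadabra"
    h = entropy_bits_per_byte(text)
    # Under the i.i.d. model, no lossless code can use fewer than about
    # h bits per byte on average for this source.
    print(f"{h:.2f} bits/byte; at least {h * len(text) / 8:.1f} bytes needed")

Practical compressors such as ZIP also exploit repetition and context, so they can beat this single-symbol estimate on structured data while still obeying the underlying entropy limit of the true source.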

Cryptography

cryptographic, cryptographer, cryptology
The theory has also found applications in other areas, including statistical inference, natural language processing, cryptography, neurobiology, human vision, the evolution and function of molecular codes (bioinformatics), model selection in statistics, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.
This fundamental principle was first explicitly stated in 1883 by Auguste Kerckhoffs and is generally called Kerckhoffs's Principle; alternatively and more bluntly, it was restated by Claude Shannon, the inventor of information theory and the fundamentals of theoretical cryptography, as Shannon's Maxim—'the enemy knows the system'.

Artificial intelligence

AI, A.I., artificially intelligent
Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Along with concurrent discoveries in neurobiology, information theory and cybernetics, this led researchers to consider the possibility of building an electronic brain.

Bell Labs

Bell Laboratories, Bell Telephone Laboratories, AT&T Bell Laboratories
Prior to Shannon's 1948 paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability.
Researchers working at Bell Labs are credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages C, C++, and S.

Complex system

complex systems, complexity theory, complexity science
Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Complex systems is therefore often used as a broad term encompassing a research approach to problems in many diverse disciplines, including statistical physics, information theory, nonlinear dynamics, anthropology, computer science, meteorology, sociology, economics, psychology, and biology.

Systems science

systems scientist, systems sciences, system science
Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions.
Systems science covers formal sciences such as complex systems, cybernetics, dynamical systems theory, information theory, linguistics or systems theory.

Ralph Hartley

Hartley, Ralph V. L. Hartley, R. V. L. Hartley
Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S is the number of possible symbols and n the number of symbols in a transmission.
He invented the Hartley oscillator and the Hartley transform, and contributed to the foundations of information theory.

Bit

bits, binary digit, binary digits
A common unit of information is the bit, based on the binary logarithm.
The bit is a basic unit of information in information theory, computing, and digital communications.
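Concretely, the self-information of an outcome with probability p is −log₂ p bits, so observing one flip of a fair coin conveys

    -\log_2 \tfrac{1}{2} = 1 \text{ bit},

and observing one of 8 equally likely outcomes conveys log₂ 8 = 3 bits.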

Shannon's source coding theorem

source coding theorem, Shannon's noiseless coding theorem, noiseless coding
In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits to possible data compression, and the operational meaning of the Shannon entropy.
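A sketch of the i.i.d. statement: blocks of n independent copies of a source X can be encoded into marginally more than n·H(X) bits with negligible probability of loss as n grows, while encoding into fewer bits makes loss virtually certain. Equivalently, the minimum expected number of bits per symbol used by any lossless (uniquely decodable) block code converges to the entropy:

    \lim_{n \to \infty} \frac{1}{n}\, \min \mathbb{E}\big[\ell(X_1, \ldots, X_n)\big] = H(X).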

Redundancy (information theory)

redundancy, redundant, data redundancy
The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value.
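For a source over an alphabet A, the maximum possible entropy is log₂|A| (attained by a uniform, memoryless source), so the relative redundancy can be sketched as

    1 - \frac{H(X)}{\log_2 |A|},

which is 0 for an incompressible source and approaches 1 for a highly predictable one.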

Mathematics

mathematical, math, mathematician
The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, information engineering, and electrical engineering.
Theoretical computer science includes computability theory, computational complexity theory, and information theory.