# Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy of an ensemble and its maximum possible value.
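This definition can be sketched directly in code (the function names here are illustrative, not from any particular library):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def redundancy(p):
    """Fractional shortfall of H(p) from its maximum, log2(len(p))."""
    h_max = math.log2(len(p))
    return 1 - entropy(p) / h_max

# A uniform source attains maximum entropy, hence zero redundancy;
# a skewed source is redundant.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(redundancy(uniform))  # 0.0
print(redundancy(skewed))   # roughly 0.32
```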
## Related Articles

### Data compression

Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.
Lossless compression reduces bits by identifying and eliminating statistical redundancy.
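A quick way to see statistical redundancy being eliminated is to compare how a general-purpose lossless compressor handles highly redundant versus random-looking data, e.g. with Python's `zlib`:

```python
import os
import zlib

redundant = b"abab" * 1000        # 4000 bytes with strong statistical redundancy
random_ish = os.urandom(4000)     # 4000 bytes with no exploitable redundancy

# The redundant data shrinks dramatically; the random data does not
# (and may even grow slightly from container overhead).
print(len(zlib.compress(redundant)))   # far smaller than 4000
print(len(zlib.compress(random_ish)))  # about 4000 or slightly more
```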

### Error detection and correction

All error-detection and correction schemes add some redundancy (i.e., some extra data) to a message, which receivers can use to check consistency of the delivered message, and to recover data that has been determined to be corrupted.
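A minimal sketch of the simplest such scheme, a single even-parity bit (function names are illustrative):

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """True if the received word has even parity (no single-bit error detected)."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(check_parity(word))    # True: message consistent
corrupted = word[:]
corrupted[1] ^= 1            # flip one bit in transit
print(check_parity(corrupted))  # False: the redundant bit exposes the error
```

One extra bit suffices to detect any single-bit error, though it cannot locate or correct it; correction requires more redundancy.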

### Information theory

Information theory studies the information entropy and redundancy of a source, and their relevance through the source coding theorem.

### Entropy (information theory)

In describing the redundancy of raw data, the rate of a source of information is the average entropy per symbol.
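The per-symbol entropy of raw data can be estimated from empirical symbol frequencies; a minimal sketch:

```python
import math
from collections import Counter

def entropy_per_symbol(text):
    """Average Shannon entropy per symbol in bits, from empirical frequencies."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(entropy_per_symbol("aaaa"))  # 0.0: a constant source carries no information
print(entropy_per_symbol("abab"))  # 1.0: two equiprobable symbols
print(entropy_per_symbol("abcd"))  # 2.0: four equiprobable symbols
```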

### Channel capacity

Adding redundancy for error detection is what makes reliable communication possible over a noisy channel of limited capacity.

### Mutual information

A measure of redundancy between two variables is the mutual information or a normalized variant.
In some cases a symmetric measure may be desired; one choice is the mutual information normalized by the sum of the marginal entropies, I(X;Y) / (H(X) + H(Y)).
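A sketch computing mutual information from a joint distribution, plus one symmetric, normalized redundancy measure, I(X;Y) / (H(X) + H(Y)) (function names are my own):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) in bits, with the joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# X and Y perfectly correlated fair bits: I(X;Y) = H(X) = 1 bit.
joint = [[0.5, 0.0],
         [0.0, 0.5]]
mi = mutual_information(joint)
hx = entropy([0.5, 0.5])
hy = entropy([0.5, 0.5])
print(mi)              # 1.0
print(mi / (hx + hy))  # 0.5: the maximum, reached when the variables determine each other
```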

### Checksum

Checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel.
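A checksum is a small piece of redundant data sent alongside a message; the receiver recomputes it to detect corruption. A minimal sketch using Python's `zlib.crc32`:

```python
import zlib

message = b"information theory"
checksum = zlib.crc32(message)   # the redundant data appended for error detection

# Receiver recomputes the checksum and compares it to the one received.
print(zlib.crc32(message) == checksum)                # True: message intact
print(zlib.crc32(b"informatien theory") == checksum)  # False: corruption detected
```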

### Communication channel

Redundancy added for error detection protects messages sent over a noisy communication channel of limited capacity.

### Entropy rate

In describing the redundancy of raw data, the rate of a source of information is the average entropy per symbol.

### Stochastic process

For memoryless sources, this is merely the entropy of each symbol, while in the most general case of a stochastic process it is the limit, as n goes to infinity, of the joint entropy of the first n symbols divided by n.

### Joint entropy

The rate of a general source is the limit, as n goes to infinity, of the joint entropy of the first n symbols divided by n: r = \lim_{n \to \infty} \frac{1}{n} H(M_1, M_2, \dots, M_n).
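This limit can be watched numerically for a small source with memory. The sketch below uses a hypothetical two-state Markov source (the transition matrix is my own example) and computes the joint entropy of the first n symbols by brute-force enumeration:

```python
import itertools
import math

# Hypothetical two-state Markov source, started from its stationary distribution.
P = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}   # transition probabilities
pi = {0: 5 / 6, 1: 1 / 6}                        # stationary distribution of P

def joint_entropy(n):
    """H(X_1, ..., X_n) in bits, by enumerating all length-n sequences."""
    h = 0.0
    for seq in itertools.product((0, 1), repeat=n):
        p = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a][b]
        if p > 0:
            h -= p * math.log2(p)
    return h

# H(X_1..X_n)/n decreases toward the source's entropy rate as n grows.
for n in (1, 4, 8):
    print(n, joint_entropy(n) / n)
```

For this source H(X_1) is about 0.65 bits, while the per-symbol figure drifts down toward the conditional entropy of a transition (about 0.56 bits).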

### Logarithm

The absolute rate of a source is the logarithm of the cardinality of the message space, or alphabet.

### Cardinality

The absolute rate depends on the cardinality (number of elements) of the message space, or alphabet, through its logarithm.

### Hartley function

(This formula is sometimes called the Hartley function.) It is the maximum possible rate of information that can be transmitted with that alphabet.
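For a concrete number, the absolute rate of a 26-letter alphabet is log2(26):

```python
import math

# Absolute rate R = log2 |alphabet|: the maximum bits per symbol
# (the Hartley function, with the logarithm taken base 2 for bits).
alphabet_size = 26            # e.g. the English letters
R = math.log2(alphabet_size)
print(round(R, 2))            # 4.7 bits per symbol at most
```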

### Discrete uniform distribution

(The logarithm should be taken to a base appropriate for the unit of measurement in use.) The absolute rate is equal to the actual rate if the source is memoryless and has a uniform distribution.

### Data compression ratio

The quantity \frac{D}{R} is called the relative redundancy and gives the maximum possible data compression ratio, when expressed as the percentage by which a file size can be decreased.
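A rough illustration using commonly quoted figures for English text (the entropy-rate estimate below is an assumption for the sake of the example, not an exact value):

```python
import math

R = math.log2(26)   # absolute rate of a 26-letter alphabet, about 4.7 bits/letter
r = 1.5             # an often-quoted rough estimate of English's entropy rate
D = R - r           # absolute redundancy
relative_redundancy = D / R
print(round(relative_redundancy, 2))  # 0.68: files could shrink by roughly 68%
```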

### Total correlation

A measure of redundancy among many variables is given by the total correlation.
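Total correlation is the sum of the marginal entropies minus the joint entropy, C(X_1, ..., X_n) = \sum_i H(X_i) - H(X_1, ..., X_n); it is zero exactly when the variables are independent. A minimal sketch (function names are my own):

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def total_correlation(joint):
    """C(X1..Xn) = sum_i H(Xi) - H(X1..Xn), for a joint dist over tuples."""
    n = len(next(iter(joint)))
    marginals = []
    for i in range(n):
        m = {}
        for outcome, p in joint.items():
            m[outcome[i]] = m.get(outcome[i], 0.0) + p
        marginals.append(m)
    return sum(entropy(m) for m in marginals) - entropy(joint)

# Three copies of one fair bit: maximal redundancy among the variables.
copies = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(total_correlation(copies))  # 2.0 bits
# Three independent fair bits: no redundancy at all.
indep = {bits: 0.125 for bits in product((0, 1), repeat=3)}
print(total_correlation(indep))   # 0.0
```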

### Expected value

Redundancy of compressed data refers to the difference between the expected compressed data length of n messages, L(M^n) = \mathbb{E}[\ell(M^n)], and the entropy nr.

### Ergodicity

(Here we assume the data is ergodic and stationary, e.g., a memoryless source.) Although the per-message rate difference \frac{1}{n} L(M^n) - r can be made arbitrarily small as n grows, the actual difference L(M^n) - nr need not be.

### Stationary process

These redundancy results assume the source is stationary as well as ergodic; a memoryless source is the simplest example of both.

### Entropy encoding

Entropy encoding is also known as minimum-redundancy coding.

### Huffman coding

Huffman coding constructs a minimum-redundancy prefix code for a known symbol distribution.

### Negentropy

Negentropy, or negative entropy, similarly measures the shortfall of a distribution's entropy from a maximum reference value.

### Shannon's source coding theorem

Shannon's source coding theorem bounds how much redundancy lossless compression can remove: the expected code length cannot fall below the source's entropy rate.

### Overcompleteness

An overcomplete representation uses more elements than strictly needed to span the space, so it is redundant by construction.