Redundancy (information theory)
In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value.
Related Articles
Data compression
Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.
Lossless compression reduces bits by identifying and eliminating statistical redundancy.
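As a minimal illustration (not part of the original article), the following Python sketch compresses a highly redundant byte string and a random one with zlib; the redundant input shrinks dramatically, while the near-maximal-entropy input does not:

```python
import os
import zlib

# Highly redundant input: a short pattern repeated many times.
redundant = b"abab" * 1000          # 4000 bytes
# Near-incompressible input: uniformly random bytes have close to maximal entropy.
random_bytes = os.urandom(4000)

print(len(zlib.compress(redundant)))     # tiny: statistical redundancy removed
print(len(zlib.compress(random_bytes)))  # ~4000 or slightly more: nothing to remove
```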


Error detection and correction
All error-detection and correction schemes add some redundancy (i.e., some extra data) to a message, which receivers can use to check consistency of the delivered message, and to recover data that has been determined to be corrupted.
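As a hedged sketch of the idea (the article does not prescribe a particular scheme), a CRC-32 checksum adds a few redundant bytes that let a receiver detect a corrupted message:

```python
import zlib

message = b"attack at dawn"
checksum = zlib.crc32(message)  # redundant check value sent along with the message

# Receiver side: flip one bit to simulate noise on the channel.
corrupted = bytes([message[0] ^ 0x01]) + message[1:]

print(zlib.crc32(message) == checksum)    # True: consistent, message accepted
print(zlib.crc32(corrupted) == checksum)  # False: corruption detected
```

Note that a bare checksum only detects errors; recovering the data requires an error-correcting code with more structured redundancy.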
Information theory
The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.

Entropy (information theory)
In describing the redundancy of raw data, the rate of a source of information is the average entropy per symbol.
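For concreteness, here is a small sketch (helper names are illustrative, not from the article) that estimates the entropy per symbol of a string from empirical frequencies and reports the resulting redundancy relative to the maximum possible rate:

```python
from collections import Counter
from math import log2

def entropy_per_symbol(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

text = "the quick brown fox jumps over the lazy dog"
h = entropy_per_symbol(text)
max_rate = log2(len(set(text)))        # maximum possible rate for this alphabet
print(h, max_rate, 1 - h / max_rate)   # entropy, absolute rate, redundancy
```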

Channel capacity
Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.

Mutual information
A measure of redundancy between two variables is the mutual information or a normalized variant. In some cases a symmetric measure may be desired, such as the redundancy measure R = \frac{I(X;Y)}{H(X) + H(Y)}.
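A minimal numerical sketch of that symmetric measure, using an assumed toy joint distribution over two binary variables:

```python
from math import log2

# Assumed joint distribution p(x, y); the variables are positively correlated.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}

mi = sum(v * log2(v / (px[x] * py[y])) for (x, y), v in p.items())  # I(X;Y)
hx = -sum(v * log2(v) for v in px.values())                         # H(X)
hy = -sum(v * log2(v) for v in py.values())                         # H(Y)

print(mi / (hx + hy))  # symmetric redundancy I(X;Y) / (H(X) + H(Y))
```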

Checksum
Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.
Communication channel
Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.

Entropy rate
In describing the redundancy of raw data, the rate of a source of information is the average entropy per symbol.
Stochastic process
For memoryless sources, this is merely the entropy of each symbol, while, in the most general case of a stochastic process, it is r = \lim_{n \to \infty} \frac{1}{n} H(M_1, M_2, \ldots, M_n).

Joint entropy
The rate r = \lim_{n \to \infty} \frac{1}{n} H(M_1, \ldots, M_n) is the limit, as n goes to infinity, of the joint entropy of the first n symbols divided by n.
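To make the limit concrete, the following sketch (the transition probabilities are assumed toy values) computes H(M_1, ..., M_n)/n for a two-state Markov chain by enumerating block probabilities; the ratio settles toward the entropy rate as n grows:

```python
from itertools import product
from math import log2

# Assumed two-state Markov chain: P[a][b] = probability of moving from a to b.
P = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}
start = {0: 5 / 6, 1: 1 / 6}  # stationary distribution of P

def block_entropy(n: int) -> float:
    """Joint entropy H(M_1, ..., M_n) in bits, by exhaustive enumeration."""
    h = 0.0
    for block in product((0, 1), repeat=n):
        prob = start[block[0]]
        for a, b in zip(block, block[1:]):
            prob *= P[a][b]
        h -= prob * log2(prob)
    return h

for n in (1, 2, 4, 8):
    print(n, block_entropy(n) / n)  # per-symbol entropy approaches the rate
```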
Logarithm
The absolute rate is R = \log |\mathbb{M}|, the logarithm of the cardinality of the message space, or alphabet.

Cardinality
The absolute rate is R = \log |\mathbb{M}|, the logarithm of the cardinality of the message space, or alphabet.
Hartley function
(This formula is sometimes called the Hartley function.) This is the maximum possible rate of information that can be transmitted with that alphabet.
Discrete uniform distribution
(The logarithm should be taken to a base appropriate for the unit of measurement in use.) The absolute rate is equal to the actual rate if the source is memoryless and has a uniform distribution.
Data compression ratio
The quantity \frac{D}{R}, where D = R - r is the absolute redundancy, is called the relative redundancy and gives the maximum possible data compression ratio, when expressed as the percentage by which a file size can be decreased.
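As a worked example (the 27-symbol alphabet and the rate of 1.5 bits per symbol are illustrative assumptions, roughly in line with common estimates for English text):

```python
from math import log2

R = log2(27)   # absolute rate: log of alphabet cardinality (26 letters + space)
r = 1.5        # assumed actual rate of the source, in bits per symbol
D = R - r      # absolute redundancy

print(D / R)   # relative redundancy: about 0.68, so files could shrink ~68%
```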
Total correlation
A measure of redundancy among many variables is given by the total correlation.
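A small sketch (the joint distribution is an assumed toy example) computing the total correlation as the sum of marginal entropies minus the joint entropy:

```python
from itertools import product
from math import log2

# Assumed distribution: X and Y are independent fair coins, Z = X XOR Y.
p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(i: int) -> dict:
    m: dict = {}
    for outcome, v in p.items():
        m[outcome[i]] = m.get(outcome[i], 0.0) + v
    return m

h_joint = -sum(v * log2(v) for v in p.values())
h_sum = sum(-sum(v * log2(v) for v in marginal(i).values()) for i in range(3))

print(h_sum - h_joint)  # total correlation: 3 - 2 = 1 bit of shared redundancy
```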
Expected value
Redundancy of compressed data refers to the difference between the expected compressed data length of n messages, L(M^n) = \mathbb{E}[\ell(M^n)], and nr.
Ergodicity
(Here we assume the data is ergodic and stationary, e.g., a memoryless source.) Although the rate difference \frac{1}{n} L(M^n) - r can be made arbitrarily small as n increases, the actual difference L(M^n) - nr cannot.
Stationary process
(Here we assume the data is ergodic and stationary, e.g., a memoryless source.) Although the rate difference \frac{1}{n} L(M^n) - r can be made arbitrarily small as n increases, the actual difference L(M^n) - nr cannot.
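To see the gap empirically, this sketch (source parameters are assumptions) compares a general-purpose compressor's output length for n symbols from a biased memoryless source against the nr lower bound:

```python
import random
import zlib
from math import log2

random.seed(0)
p = 0.9  # assumed symbol bias of the memoryless source
r = -(p * log2(p) + (1 - p) * log2(1 - p))  # entropy rate, bits per symbol

n = 10_000
msg = bytes(int(random.random() < p) for _ in range(n))  # one byte per symbol
compressed_bits = 8 * len(zlib.compress(msg, 9))

print(n * r, compressed_bits)  # ideal nr bound vs. actual length (above the bound)
```

A general-purpose compressor will not reach nr exactly, but the per-symbol gap shrinks for longer inputs, which is the sense in which the rate difference can be made small.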

Entropy encoding
Huffman coding

Negentropy
Shannon's source coding theorem