Entropy encoding

In information theory, an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium.
Related Articles

Prefix code

One of the main types of entropy coding creates and assigns a unique prefix-free code to each unique symbol that occurs in the input.
(This is closely related to minimizing the entropy.) This is a form of lossless data compression based on entropy encoding.
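
As a concrete illustration (not part of the article), here is a minimal Python sketch that verifies the prefix-free property for a hypothetical codebook; the function name and codebooks are invented for this example:

```python
def is_prefix_free(codebook):
    """Return True if no codeword is a prefix of another codeword."""
    codes = sorted(codebook.values())
    # After lexicographic sorting, any prefix relation must occur
    # between immediate neighbours, so checking pairs suffices.
    return all(not b.startswith(a) for a, b in zip(codes, codes[1:]))

# A hypothetical codebook: frequent symbols get shorter codewords.
codebook = {"a": "0", "b": "10", "c": "110", "d": "111"}
assert is_prefix_free(codebook)                      # valid prefix code
assert not is_prefix_free({"a": "0", "b": "01"})     # "0" is a prefix of "01"
```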

Arithmetic coding

Two of the most common entropy encoding techniques are Huffman coding and arithmetic coding. Since 2014, data compressors have started using the Asymmetric Numeral Systems family of entropy coding techniques, which combines the compression ratio of arithmetic coding with a processing cost similar to that of Huffman coding.
Arithmetic coding is a form of entropy encoding used in lossless data compression.
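
The following is a minimal sketch of the interval-narrowing idea behind arithmetic coding, using exact fractions and a three-symbol model invented for the example; it is an illustration, not a production coder:

```python
from fractions import Fraction

# Hypothetical static model: symbol -> (cumulative probability, probability)
MODEL = {"a": (Fraction(0), Fraction(1, 2)),
         "b": (Fraction(1, 2), Fraction(1, 3)),
         "c": (Fraction(5, 6), Fraction(1, 6))}

def encode(message):
    """Narrow [low, low+width) once per symbol; any number in the
    final interval identifies the message (given its length)."""
    low, width = Fraction(0), Fraction(1)
    for sym in message:
        cum, p = MODEL[sym]
        low += cum * width
        width *= p
    return low, width  # e.g. pick (low + width/2) as the code value

def decode(value, n):
    """Invert the narrowing: find which sub-interval contains value."""
    out = []
    for _ in range(n):
        for sym, (cum, p) in MODEL.items():
            if cum <= value < cum + p:
                out.append(sym)
                value = (value - cum) / p  # rescale back to [0, 1)
                break
    return "".join(out)

low, width = encode("abac")
assert decode(low + width / 2, 4) == "abac"
```

Real implementations avoid unbounded fractions by renormalizing a fixed-width integer range and emitting bits as soon as they are determined.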

Huffman coding

Two of the most common entropy encoding techniques are Huffman coding and arithmetic coding. Since 2014, data compressors have started using the Asymmetric Numeral Systems family of entropy coding techniques, which combines the compression ratio of arithmetic coding with a processing cost similar to that of Huffman coding.
As in other entropy encoding methods, more common symbols are generally represented using fewer bits than less common symbols.
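
A minimal Python sketch of Huffman code construction (the function name and the merge-codebooks-in-a-heap approach are choices made for this example, not from the article):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix code; frequent symbols get shorter codewords."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, {symbol: codeword-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code("abracadabra")   # 'a' is the most frequent symbol,
assert len(code["a"]) <= min(       # so it gets the shortest codeword
    len(w) for s, w in code.items() if s != "a")
```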

Entropy (information theory)

These entropy encoders then compress data by replacing each fixed-length input symbol with the corresponding variable-length prefix-free output codeword.
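
To make the connection with entropy concrete, here is a small sketch (the distribution and code are invented for the example) comparing the Shannon entropy with the average codeword length of a matching prefix code:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)): the lower bound, in bits per symbol,
    on the average codeword length of any lossless symbol code."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical source: probabilities and a matching prefix code.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code  = {"a": "0", "b": "10", "c": "110", "d": "111"}

H   = shannon_entropy(probs.values())                  # 1.75 bits/symbol
avg = sum(p * len(code[s]) for s, p in probs.items())  # 1.75 bits/symbol
assert avg >= H - 1e-9  # dyadic probabilities: the bound is met exactly
```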

Unary coding

These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding).
Unary coding, also known as the unary numeral system or thermometer code, is an entropy encoding that represents a natural number n with n ones followed by a zero (if "natural number" is understood as a non-negative integer) or with n − 1 ones followed by a zero (if it is understood as a strictly positive integer).
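
A direct transcription of the non-negative-integer convention above into Python (a sketch, with invented function names):

```python
def unary_encode(n):
    """Encode non-negative integer n as n ones followed by a zero."""
    return "1" * n + "0"

def unary_decode(bits):
    """Count the leading ones up to the terminating zero."""
    return bits.index("0")

assert unary_encode(4) == "11110"
assert unary_decode("11110") == 4
assert unary_encode(0) == "0"
```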

Asymmetric numeral systems

Since 2014, data compressors have started using the Asymmetric Numeral Systems family of entropy coding techniques, which combines the compression ratio of arithmetic coding with a processing cost similar to that of Huffman coding.
Asymmetric numeral systems (ANS) is a family of entropy encoding methods introduced by Jarosław (Jarek) Duda from Jagiellonian University, used in data compression since 2014 due to improved performance compared to previously used methods; it can be up to 30 times faster.
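
A heavily simplified sketch of the range variant (rANS), keeping the entire coder state in one Python big integer so that no renormalization is needed; the two-symbol model is invented for illustration, and real implementations stream bits out of a bounded state:

```python
# Toy rANS: the state x is a single big integer -- illustration only.
FREQ = {"a": 3, "b": 1}      # hypothetical symbol frequencies
M = sum(FREQ.values())       # total frequency = 4
CDF = {"a": 0, "b": 3}       # cumulative frequencies

def encode(message):
    x = 1                    # initial state
    for s in message:
        f, c = FREQ[s], CDF[s]
        x = (x // f) * M + c + (x % f)   # push symbol s onto the state
    return x

def decode(x, n):
    """Pop n symbols; they come out in reverse encoding order."""
    out = []
    for _ in range(n):
        slot = x % M                     # which sub-range of [0, M)?
        s = "a" if slot < CDF["b"] else "b"
        f, c = FREQ[s], CDF[s]
        x = f * (x // M) + slot - c      # invert the push
        out.append(s)
    return "".join(reversed(out))

msg = "abaa"
assert decode(encode(msg), len(msg)) == msg
```

Each push grows the state by roughly a factor of 1/p for a symbol of probability p, so log2(x) approaches the message's information content.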

Golomb coding

These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding).
Rice coding is used as the entropy encoding stage in a number of lossless image compression and audio data compression methods.
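
A minimal sketch of Rice coding with parameter k (function names invented for the example): the quotient n >> k is written in unary and the remainder in k binary bits:

```python
def rice_encode(n, k):
    """Rice code: quotient n >> k in unary, then k remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits, k):
    q = bits.index("0")                  # unary quotient
    r = int(bits[q + 1 : q + 1 + k], 2)  # k-bit binary remainder
    return (q << k) | r

assert rice_encode(19, 3) == "110011"    # q = 2, r = 3
assert rice_decode("110011", 3) == 19
```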

Range encoding

Range encoding is an entropy coding method defined by G. Nigel N. Martin in a 1979 paper, which effectively rediscovered the FIFO arithmetic code first introduced by Richard Clark Pasco in 1976.

Fibonacci coding

These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding).
For general constraints defining which symbols are allowed after a given symbol, the maximal information rate can be obtained by first finding the optimal transition probabilities using a Maximal Entropy Random Walk, then using an entropy coder (with the roles of encoder and decoder swapped) to encode a message as a sequence of symbols that follows those optimal transition probabilities.
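
Returning to Fibonacci coding itself, a minimal sketch (function name invented) that emits the Zeckendorf representation of n, least significant digit first, followed by a terminating 1, so every codeword ends in "11":

```python
def fib_encode(n):
    """Fibonacci code for n >= 1: write n as a sum of non-consecutive
    Fibonacci numbers (Zeckendorf), then append a final 1."""
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    bits = []
    for f in reversed(fibs[:-1]):        # greedy, largest Fibonacci first
        if f <= n:
            bits.append("1")
            n -= f
        else:
            bits.append("0")
    return "".join(reversed(bits)) + "1"

assert fib_encode(1) == "11"
assert fib_encode(4) == "1011"           # 4 = 1 + 3
assert fib_encode(11) == "001011"        # 11 = 3 + 8
```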

Information theory

In information theory, an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium.

Symbol rate

One of the main types of entropy coding creates and assigns a unique prefix-free code to each unique symbol that occurs in the input.

Proportionality (mathematics)

The length of each codeword is approximately proportional to the negative logarithm of the probability.

Logarithm

The length of each codeword is approximately proportional to the negative logarithm of the probability.

Probability

The length of each codeword is approximately proportional to the negative logarithm of the probability.

Claude Shannon

According to Shannon's source coding theorem, the optimal code length for a symbol is −log_b(P), where b is the number of symbols used to make output codes and P is the probability of the input symbol.

Shannon's source coding theorem

According to Shannon's source coding theorem, the optimal code length for a symbol is −log_b(P), where b is the number of symbols used to make output codes and P is the probability of the input symbol.
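
A quick worked check of the formula for b = 2, with toy probabilities chosen for the example:

```python
from math import log2

# Optimal binary (b = 2) code lengths, -log2(P):
for P in (0.5, 0.25, 0.125):
    print(f"P = {P}: optimal length = {-log2(P):g} bits")
# P = 0.5   -> 1 bit   (e.g. Huffman assigns "0")
# P = 0.25  -> 2 bits  ("10")
# P = 0.125 -> 3 bits  ("110" or "111")
```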

Signal compression

If the approximate entropy characteristics of a data stream are known in advance (especially for signal compression), a simpler static code may be useful.

Universal code (data compression)

These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding).

Elias gamma coding

These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding).
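
A minimal sketch of Elias gamma coding (function name invented): a number whose binary form has L bits is written as L − 1 zeros followed by those L bits:

```python
def elias_gamma_encode(n):
    """Elias gamma code for n >= 1: emit len(bin(n)) - 1 zeros,
    then the binary representation of n itself."""
    binary = format(n, "b")
    return "0" * (len(binary) - 1) + binary

assert elias_gamma_encode(1) == "1"
assert elias_gamma_encode(4) == "00100"
assert elias_gamma_encode(10) == "0001010"
```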

Similarity measure

Besides using entropy encoding as a way to compress digital data, an entropy encoder can also be used to measure the amount of similarity between streams of data and already existing classes of data.

Data stream

Besides using entropy encoding as a way to compress digital data, an entropy encoder can also be used to measure the amount of similarity between streams of data and already existing classes of data.

Statistical classification

This is done by generating an entropy coder/compressor for each class of data; unknown data is then classified by feeding the uncompressed data to each compressor and seeing which compressor yields the highest compression.
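
A hedged sketch of this idea using zlib as an off-the-shelf compressor (the corpora and decision rule are invented for the example): appending the unknown data to each class's reference text and measuring the size increase approximates "which compressor yields the highest compression":

```python
import zlib

def classify(unknown, corpora):
    """Pick the class whose reference text makes the unknown data
    compress best: smallest size increase when appended."""
    def extra_bytes(ref):
        base = len(zlib.compress(ref.encode()))
        both = len(zlib.compress((ref + unknown).encode()))
        return both - base
    return min(corpora, key=lambda cls: extra_bytes(corpora[cls]))

# Hypothetical per-class reference data:
corpora = {"dna":  "ACGTACGTTAGCACGTGACTTACG" * 8,
           "bits": "010110100010110101101101" * 8}
assert classify("ACGTACGTAGC", corpora) == "dna"
```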

Context-adaptive binary arithmetic coding

Context-adaptive binary arithmetic coding (CABAC) is a form of entropy encoding used in the H.264/MPEG-4 AVC and High Efficiency Video Coding (HEVC) standards.

FFmpeg

The logo uses a zigzag pattern that shows how MPEG video codecs handle entropy encoding.