Comparison of spectrograms of audio in an uncompressed format and several lossy formats. The lossy spectrograms show bandlimiting of higher frequencies, a common technique associated with lossy audio compression.
A picture showing scratches on the readable surface of a CD-R. Music and data CDs are coded using error-correcting codes, so they can still be read through error detection and correction even if they have minor scratches.
Solidyne 922: The world's first commercial audio bit compression sound card for PC, 1990
Processing stages of a typical video encoder

Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.

- Redundancy (information theory)
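
To pin down what is being reduced or added, redundancy can be quantified. A minimal sketch using the standard definitions, where r is the source's entropy rate and R is the maximum rate its alphabet could carry if every symbol were equally likely and independent:

```latex
\begin{align}
  R   &= \log_2 |\mathcal{X}|  && \text{maximum possible rate of the alphabet } \mathcal{X} \\
  D   &= R - r                 && \text{absolute redundancy} \\
  D/R &= 1 - r/R               && \text{relative redundancy; } r/R \text{ is the coding efficiency}
\end{align}
```

Compression tries to push the transmitted rate down toward r, while forward error correction deliberately re-introduces structured redundancy so that a decoder can detect and correct channel errors.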

Lossless compression reduces bits by identifying and eliminating statistical redundancy.

- Data compression
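
As a concrete illustration, here is a small sketch in Python using the standard-library zlib module: input with obvious statistical redundancy shrinks dramatically, while input with none barely compresses at all.

```python
import os
import zlib

# Highly redundant input: the same short pattern repeated many times.
redundant = b"ABABABAB" * 1000      # 8000 bytes, very little information per byte
# Essentially random input: almost no statistical redundancy to remove.
random_data = os.urandom(8000)

for label, data in (("redundant", redundant), ("random", random_data)):
    compressed = zlib.compress(data, 9)   # DEFLATE: LZ77 matching + Huffman coding
    print(f"{label:9s} {len(data)} bytes -> {len(compressed)} bytes")

# Typical result: the redundant input drops to a few dozen bytes, while the
# random input stays at (or slightly above) its original size.
```

The exact sizes depend on the compressor, but the asymmetry is the point: a lossless coder can only remove redundancy that is actually present in the data.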

2 related topics



Information theory

Scientific study of the quantification, storage, and communication of digital information.



Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.

Key quantities of the field include the information entropy and redundancy of a source, whose operational significance is established by the source coding theorem.
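
For reference, the source coding theorem can be stated compactly; a sketch of its usual form, assuming an i.i.d. source X and an optimal prefix code over blocks of n symbols with expected length L_n per symbol:

```latex
% Shannon's source coding theorem (symbol-code form, i.i.d. source X):
H(X) \;\le\; L_n \;<\; H(X) + \frac{1}{n}
% The achievable rate approaches the entropy H(X) as the block length n grows,
% and no lossless code can average fewer than H(X) bits per symbol.
```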

Two bits of entropy: In the case of two fair coin tosses, the information entropy in bits is the base-2 logarithm of the number of possible outcomes; with two coins there are four possible outcomes, and two bits of entropy. Generally, information entropy is the average amount of information conveyed by an event, when considering all possible outcomes.

Entropy (information theory)

Average level of "information", "surprise", or "uncertainty" inherent to a random variable's possible outcomes.
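
In symbols, for a discrete random variable X taking values x with probabilities p(x), the entropy in bits (base-2 logarithm) and the two-fair-coin example pictured above work out as:

```latex
H(X) = -\sum_{x} p(x)\,\log_2 p(x)

% Two fair coin tosses: four equally likely outcomes, each with p(x) = 1/4
H = -\,4 \cdot \tfrac{1}{4}\log_2\tfrac{1}{4} = \log_2 4 = 2 \text{ bits}
```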



Information theory is useful for calculating the smallest amount of information required to convey a message, as in data compression.
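
A rough sketch of that calculation in Python, assuming the message is treated as a stream of independent symbols (the sample message, entropy estimate, and comparison with zlib below are illustrative only):

```python
import math
import zlib
from collections import Counter

message = b"abracadabra " * 500      # a message with a skewed symbol distribution

# Zeroth-order (per-symbol) entropy estimate, assuming i.i.d. symbols.
n = len(message)
entropy = -sum((c / n) * math.log2(c / n) for c in Counter(message).values())

print(f"entropy estimate     : {entropy:.3f} bits/symbol")
print(f"entropy lower bound  : {entropy * n / 8:.0f} bytes")
print(f"zlib compressed size : {len(zlib.compress(message, 9))} bytes")

# zlib ends up far below the raw size; here it even beats the zeroth-order bound,
# because the repeated phrase gives it higher-order structure that the i.i.d.
# estimate ignores. The bound is the minimum only for a memoryless source model.
```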

See also Redundancy (information theory).