# A report on information theory

Scientific study of the quantification, storage, and communication of digital information.


## Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as the "father of information theory".

## Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes.
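As a minimal sketch of this definition, Shannon entropy can be computed directly from a list of outcome probabilities (the `entropy` helper below is illustrative, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average surprisal over all outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower (~0.469 bits).
print(entropy([0.9, 0.1]))
```

Note that outcomes with probability zero contribute nothing, which is why the sum skips them.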

## Error detection and correction

In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels.
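One of the simplest error-correcting schemes is the (3,1) repetition code: each bit is transmitted three times, and the receiver takes a majority vote, which corrects any single flipped bit per triple. A sketch (the function names are hypothetical):

```python
def encode(bits):
    """(3,1) repetition code: send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each triple corrects any single bit flip."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                 # the channel flips one bit in transit
assert decode(sent) == msg   # the majority vote recovers the message
```

Real codes (Hamming, Reed–Solomon, LDPC) achieve far better rates, but the repetition code shows the core idea: redundancy buys reliability.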

## Data compression

In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation.
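A toy example of source coding is run-length encoding, which replaces a run of repeated symbols with one (symbol, count) pair. The sketch below is illustrative; practical compressors (Huffman coding, LZ77, arithmetic coding) exploit statistical structure far more aggressively:

```python
from itertools import groupby

def rle_encode(s):
    """Run-length encoding: collapse each run of a symbol to (symbol, count)."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs):
    """Invert the encoding by repeating each symbol its counted number of times."""
    return "".join(ch * n for ch, n in pairs)

data = "aaaabbbcca"
packed = rle_encode(data)        # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(packed) == data
```

Lossless schemes like this must be invertible; lossy schemes (JPEG, MP3) trade exact reconstruction for smaller output.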

## Bell Labs

Nokia Bell Labs, originally named Bell Telephone Laboratories (1925–1984), is an American industrial research and development company, now owned by the Finnish company Nokia.

Researchers working at Bell Laboratories are credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages B, C, C++, S, SNOBOL, AWK, AMPL, and others.

## Information

That portion of the content of a signal or message which conveys meaning.

Information theory builds on this idea: the more uncertain an event is, the more information is required to resolve its uncertainty.
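This relationship is captured by self-information (surprisal), defined as the negative log-probability of an outcome: rarer events carry more bits. A small sketch (the `surprisal` helper is illustrative):

```python
import math

def surprisal(p):
    """Self-information in bits of an outcome with probability p."""
    return -math.log2(p)

print(surprisal(0.5))    # fair coin flip: 1 bit
print(surprisal(1 / 6))  # one face of a fair die: ~2.585 bits
```

Entropy is then just the probability-weighted average of these surprisal values over all outcomes.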

## Computer science

Study of computation, automation, and information.

Computer science spans from theoretical disciplines (such as algorithms, theory of computation, information theory, and automation) to practical disciplines (including the design and implementation of hardware and software).

## Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes called Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel.
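For the binary symmetric channel, where each transmitted bit is flipped with crossover probability p, that maximum rate has a closed form: C = 1 − H(p), with H the binary entropy function. A sketch of the computation (the helper names are illustrative):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel, bits per use."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.11))  # noisy channel: ~0.5 bits per use
print(bsc_capacity(0.5))   # pure noise: 0.0, nothing gets through
```

The theorem guarantees that rates below C are achievable with vanishing error probability, while rates above C are not, no matter how clever the code.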