Capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels.

- Units of information: In information theory, the definition of the amount of self-information and information entropy is often expressed with the binary logarithm, corresponding to making the bit the fundamental unit of information (see the sketch after this list).

- Binary logarithm
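
As a minimal sketch of the relationship mentioned in the first item (with $p(x)$ denoting the probability of an outcome $x$, notation introduced here for illustration), the self-information of an outcome measured in bits is the binary logarithm of its reciprocal probability:

$$I(x) = -\log_2 p(x)$$

Under this convention, an outcome with probability 1/2 carries exactly one bit of self-information.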


## Entropy (information theory)

Average level of "information", "surprise", or "uncertainty" inherent to a random variable's possible outcomes.

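A commonly used form of this definition (assuming a discrete random variable $X$ with probability mass function $p$, notation introduced here for illustration) expresses entropy as the expected self-information of the outcomes:

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$

With the binary logarithm the result is measured in bits; a fair coin flip, for example, has entropy $-\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1$ bit.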

The different units of information (bits for the binary logarithm