Comparison of units of information: bit, trit, nat, ban. Quantity of information is the height of bars. Dark green level is the "nat" unit.
Leonhard Euler was the first to apply binary logarithms to music theory, in 1739.
A 16-player single elimination tournament bracket with the structure of a complete binary tree. The height of the tree (number of rounds of the tournament) is the binary logarithm of the number of players, rounded up to an integer.
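
For concreteness, the relationship in that caption is rounds = ⌈log₂ players⌉. The short Python sketch below (an illustration, not taken from any tournament software) computes it exactly with integer bit lengths:

```python
def rounds_needed(players: int) -> int:
    """Rounds in a single-elimination bracket: ceil(log2(players))."""
    if players < 1:
        raise ValueError("need at least one player")
    # (players - 1).bit_length() equals ceil(log2(players)) for players >= 1,
    # avoiding any floating-point rounding from math.log2.
    return (players - 1).bit_length()

print(rounds_needed(16))  # 4 rounds for a complete 16-player bracket
print(rounds_needed(20))  # 5 rounds once the field exceeds 16 players
```
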
Binary search in a sorted array, an algorithm whose time complexity involves binary logarithms
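
As a sketch (a generic implementation, not tied to any particular library), each step below halves the remaining range, so the number of iterations is at most about the binary logarithm of the array length:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.
    Each iteration halves the search interval, so at most roughly
    log2(len(sorted_items)) + 1 comparisons are needed."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 1000, 7))      # 143 sorted values: 0, 7, 14, ...
print(binary_search(data, 700))     # 100, found in about log2(143) ≈ 7.2 steps
```
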
A microarray for approximately 8700 genes. The expression rates of these genes are compared using binary logarithms.
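
Such comparisons are usually reported as a log₂ fold change, so each unit corresponds to a doubling or halving of expression. A minimal sketch, with expression values made up purely for illustration:

```python
import math

# Hypothetical expression measurements for one gene in two conditions.
treated = 480.0
control = 120.0

# log2 fold change: +1 means doubled expression, -1 means halved.
log2_fold_change = math.log2(treated / control)
print(log2_fold_change)  # 2.0, i.e. a fourfold increase
```
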
HP-35 scientific calculator (1972). The log and ln keys are in the top row; there is no log₂ key.
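
On such a calculator the binary logarithm is obtained by change of base, log₂ x = ln x / ln 2 = log₁₀ x / log₁₀ 2. A small sketch mirroring those keystrokes:

```python
import math

x = 1_000_000

# Change of base: the same value is obtained from either the ln or the log key.
via_ln = math.log(x) / math.log(2)
via_log10 = math.log10(x) / math.log10(2)
print(via_ln, via_log10, math.log2(x))  # all ≈ 19.93
```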

Capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels.

- Units of information

In information theory, the definitions of self-information and information entropy are often expressed in terms of the binary logarithm, corresponding to making the bit the fundamental unit of information.
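
For instance, here is a sketch of the two quantities in bits, a straightforward translation of the standard formulas rather than a reference implementation:

```python
import math

def self_information_bits(p: float) -> float:
    """Self-information -log2(p) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy_bits(probabilities) -> float:
    """Shannon entropy in bits: the probability-weighted average of the
    self-information over all outcomes, using the binary logarithm."""
    return sum(p * self_information_bits(p) for p in probabilities if p > 0)

# A fair coin: each outcome carries 1 bit of self-information,
# so the entropy of the toss is 1 bit.
print(self_information_bits(0.5))  # 1.0
print(entropy_bits([0.5, 0.5]))    # 1.0
```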

- Binary logarithm

1 related topic



Entropy (information theory)

Average level of "information", "surprise", or "uncertainty" inherent to a random variable's possible outcomes.


Two bits of entropy: In the case of two fair coin tosses, the information entropy in bits is the base-2 logarithm of the number of possible outcomes; with two coins there are four possible outcomes, and two bits of entropy. Generally, information entropy is the average amount of information conveyed by an event, when considering all possible outcomes.
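
The arithmetic of that caption in code form (illustrative only): four equally likely outcomes give log₂ 4 = 2 bits.

```python
import math

# Two fair coin tosses: four equally likely outcomes.
outcomes = ["HH", "HT", "TH", "TT"]
p = 1 / len(outcomes)

# For a uniform distribution the entropy reduces to log2 of the outcome count.
entropy = -sum(p * math.log2(p) for _ in outcomes)
print(entropy, math.log2(len(outcomes)))  # 2.0 2.0 -> two bits
```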

The different units of information (bits for the binary logarithm, nats for the natural logarithm, and bans for the decimal logarithm) are constant multiples of each other.
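
Concretely, the conversion factors are fixed constants; a short sketch:

```python
import math

bits = 1.0

# The units differ only by the base of the logarithm, so conversion is a
# constant multiple: 1 bit = ln(2) nats ≈ 0.693 nats and
# 1 bit = log10(2) bans ≈ 0.301 bans.
nats = bits * math.log(2)
bans = bits * math.log10(2)
print(nats, bans)
```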