In information theory, units of information are used to measure the information contained in messages and the entropy of random variables. The different units (bits for the binary logarithm, nats for the natural logarithm, and so on) are constant multiples of each other.
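This relationship between units can be sketched in a few lines: the self-information of an event is the negative logarithm of its probability, and the unit is set entirely by the base of that logarithm. The function name `info_content` below is illustrative, not from any standard library.

```python
import math

def info_content(p: float, unit: str = "bits") -> float:
    """Self-information -log(p) of an event with probability p.

    The unit is determined by the logarithm base:
    bits (base 2), nats (base e), hartleys (base 10).
    """
    bases = {"bits": 2.0, "nats": math.e, "hartleys": 10.0}
    return -math.log(p, bases[unit])

# A fair coin flip carries 1 bit, which equals ln(2) ~ 0.693 nats.
print(info_content(0.5, "bits"))
print(info_content(0.5, "nats"))
# The units are constant multiples of each other: nats = bits * ln(2).
```

Because changing the logarithm base only multiplies the result by a constant, converting between any two units is a single multiplication.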
The bit is the most basic unit of information in computing and digital communications.
In information theory, one bit is the information entropy of a random binary variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
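The claim that one bit is the entropy of an equiprobable binary variable can be checked directly with the binary entropy function H(p) = -p log2 p - (1 - p) log2 (1 - p); equal probability (p = 0.5) is exactly where it reaches 1. A minimal sketch (the helper name `binary_entropy` is ours):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a binary variable that equals 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Equal probability gives exactly 1 bit, the maximum.
print(binary_entropy(0.5))
# A biased variable carries less than 1 bit per observation.
print(binary_entropy(0.9))
```

Any bias away from p = 0.5 lowers the entropy, which is why the fair binary variable is the natural reference point for defining the bit.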