Units of information
A unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels.
Related topics
Entropy (information theory)
The average level of "information", "surprise", or "uncertainty" inherent in a random variable's possible outcomes.
The different units of information (bits for the binary logarithm, nats for the natural logarithm, bans for the decimal logarithm, and so on) are constant multiples of each other.
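For reference, the standard definition (not quoted from the source, but the usual formula): when the logarithm is taken to base 2, the entropy of a discrete random variable X is measured in bits,

H(X) = -\sum_{x} p(x)\,\log_2 p(x).

For a fair coin, H = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 bit; computing the same sum with the natural logarithm gives the value in nats instead.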
Ternary numeral system
A ternary numeral system (also called base 3 or trinary) has three as its base.
One trit is equivalent to log2 3 (about 1.58496) bits of information.
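A short derivation of that figure: a trit distinguishes three equally likely values, so its information content is the binary logarithm of 3,

\log_2 3 = \frac{\ln 3}{\ln 2} \approx 1.58496 \text{ bits}.

Equivalently, n trits can represent 3^n combinations, the same amount of information as n\,\log_2 3 bits.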
Octet (computing)
The octet is a unit of digital information in computing and telecommunications that consists of eight bits.
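As a quick sanity check on the size of the unit: an octet can take

2^8 = 256

distinct values, conventionally written as the integers 0 through 255 or as two hexadecimal digits (00 to FF).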
Logarithmic scale
A way of displaying numerical data over a very wide range of values in compact form; typically the largest numbers in the data are hundreds or even thousands of times larger than the smallest.
Examples of logarithmic units include units of data storage capacity (bit, byte), of information and information entropy (nat, shannon, ban), and of signal level (decibel, bel, neper).
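To make the relationship between those information units concrete, the conversion factors follow from the change-of-base rule for logarithms (standard values, stated here for reference):

1 \text{ nat} = \frac{1}{\ln 2} \text{ bits} \approx 1.4427 \text{ bits}, \qquad 1 \text{ ban} = \log_2 10 \text{ bits} \approx 3.3219 \text{ bits},

while one shannon equals one bit of information.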
Orders of magnitude (data)
An order of magnitude is usually a factor of ten. The article lists multiples, sorted by order of magnitude, for units of information measured in bits and bytes.
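One detail any such list has to handle is the difference between decimal (SI) and binary (IEC) prefixes, for example

1 \text{ kB} = 10^{3} \text{ B} = 1000 \text{ B}, \qquad 1 \text{ KiB} = 2^{10} \text{ B} = 1024 \text{ B},

a gap that grows with scale: 1 \text{ TiB} = 2^{40} \text{ B} \approx 1.0995 \times 10^{12} \text{ B}, nearly 10% more than 1 TB.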
Binary logarithm
In information theory, the definitions of self-information and information entropy are often expressed using the binary logarithm, which corresponds to making the bit the fundamental unit of information.
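A worked instance of that convention (a standard textbook example, not taken from the source): the self-information of an outcome x with probability p(x) is

I(x) = -\log_2 p(x) \text{ bits},

so an event with probability 1/8 carries \log_2 8 = 3 bits of information, while a certain event (p = 1) carries none.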
Chen–Ho encoding
A memory-efficient alternative system of binary encoding for decimal digits.
Chen–Ho encoding is limited to encoding sets of three decimal digits into groups of 10 bits (so-called declets).
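The choice of 10 bits per three digits follows from a simple counting argument: three decimal digits have 10^3 = 1000 combinations, and ten bits is the smallest width that can hold them, since

2^{9} = 512 < 10^{3} = 1000 \le 2^{10} = 1024.

This is close to the information-theoretic minimum of 3\,\log_2 10 \approx 9.97 bits, compared with the 12 bits that plain binary-coded decimal would spend on the same three digits.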