Bit

bits, binary digit, binary digits, bit pattern, 5-bit, binary, binary values, data bit, Mbit, 1-
The bit is a basic unit of information in information theory, computing, and digital communications.
Related Articles

Information

informative, input, inputs
The bit is a basic unit of information in information theory, computing, and digital communications.
The bit is a typical unit of information, but other units such as the nat may be used.
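As a side note not taken from the quoted articles, the relation between the two units follows from the change of logarithm base: one nat equals 1/ln 2 ≈ 1.443 bits. A minimal Python sketch:

import math

# Self-information of an event with probability p, in bits and in nats.
p = 0.25
bits = -math.log2(p)   # binary logarithm  -> 2.0 bits
nats = -math.log(p)    # natural logarithm -> about 1.386 nats

# One nat corresponds to 1/ln(2), roughly 1.4427 bits, so the two agree:
print(bits, nats / math.log(2))   # 2.0 2.0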

Units of information

unit of information, declet, dibit
The bit is a basic unit of information in information theory, computing, and digital communications. As a unit of information, the bit has also been called a shannon, named after Claude E. Shannon.
The most commonly used units of data storage capacity are the bit, the capacity of a system that has only two states, and the byte (or octet), which is equivalent to eight bits.
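As a quick illustration (not part of the quoted text), the number of distinct values a group of bits can represent doubles with each added bit, which is why an eight-bit byte covers 256 values:

# A two-state system stores 1 bit; n independent two-state systems store
# n bits and can represent 2**n distinct values.
for n in (1, 2, 8):
    print(n, "bit(s) ->", 2 ** n, "distinct values")
# 1 bit(s) -> 2 distinct values
# 2 bit(s) -> 4 distinct values
# 8 bit(s) -> 256 distinct values (one byte, or octet)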

Shannon (unit)

shannon, shannons
As a unit of information, the bit has also been called a shannon, named after Claude E. Shannon.
The shannon (symbol: Sh), more commonly known as the bit, is a unit of information and of entropy defined by IEC 80000-13.

Byte

bytes, B, TB
A group of eight binary digits is commonly called one byte, but historically the size of the byte is not strictly defined. The most common multiple-bit unit is the byte, coined by Werner Buchholz in June 1956, which historically represented the group of bits used to encode a single character of text (until UTF-8 multibyte encoding took over) and for this reason was used as the basic addressable element in many computer architectures.
The byte is a unit of digital information that most commonly consists of eight bits.

Computer

computers, computer system, digital computer
The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz.

Information theory

information-theoretic, information theorist, information
The bit is a basic unit of information in information theory, computing, and digital communications.
A common unit of information is the bit, based on the binary logarithm.
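To make the binary-logarithm connection concrete, here is a minimal sketch of Shannon entropy measured in bits (the example distributions are made up for illustration):

import math

def entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)), measured in bits (shannons).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit per outcome
print(entropy_bits([0.9, 0.1]))   # biased coin: about 0.469 bits
print(entropy_bits([0.25] * 4))   # uniform choice among four: 2.0 bits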

John Tukey

John W. Tukey, Tukey, John Wilder Tukey
He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary information digit" to simply "bit".
He is also credited with coining the term 'bit'.

IEEE Std 260.1-2004

IEEE 260.1, IEEE 260.1:2004
The symbol for the binary digit is either simply bit per recommendation by the IEC 80000-13:2008 standard, or the lowercase character b, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards.
The symbols are sorted in alphabetical order of name, from ampere (symbol A) to zetta (symbol Z), including barrel (symbol bbl), bit (symbol b), foot (symbol ft), kibibyte (symbol KiB), kilowatthour (symbol kWh), microinch (symbol μin), quart (symbol qt), slug (symbol slug) and year (symbol a).

Computer program

program, programs, computer programs
The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
Only three bits of memory were available to store each instruction, so it was limited to eight instructions.

Flip-flop (electronics)

flip-flop, flip-flops, latch
These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, the orientation of reversible double stranded DNA, etc.
A flip-flop is a device which stores a single bit (binary digit) of data; one of its two states represents a "one" and the other represents a "zero".
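A loose behavioural model (not a circuit description; the class and method names are invented for this sketch) treats an edge-triggered D flip-flop as a one-bit store:

class DFlipFlop:
    # Toy model: holds one bit (Q) and copies input D on each clock call.
    def __init__(self):
        self.q = 0                 # stored bit, 0 or 1

    def clock(self, d):
        # On a (simulated) rising clock edge, Q takes the value of D.
        self.q = 1 if d else 0
        return self.q

ff = DFlipFlop()
print(ff.clock(1))   # 1 -> the "one" state
print(ff.clock(0))   # 0 -> the "zero" state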

Serial communication

serial, serial interface, serial communications
Bits are transmitted one at a time in serial transmission, and by a multiple number of bits in parallel transmission.
In telecommunication and data transmission, serial communication is the process of sending data one bit at a time, sequentially, over a communication channel or computer bus.
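The sketch below is purely illustrative (the least-significant-bit-first ordering and the helper names are assumptions, not taken from any standard): it splits a byte into a stream of single bits, as a serial link would send them, and reassembles it. A parallel link, by contrast, would present all eight bits during the same clock cycle.

def to_bits(byte):
    # Split one byte into 8 bits, least-significant bit first (assumed order).
    return [(byte >> i) & 1 for i in range(8)]

def from_bits(bits):
    # Reassemble 8 serially received bits back into a byte.
    return sum(bit << i for i, bit in enumerate(bits))

data = 0b01000001                  # the byte 0x41 ('A')
stream = to_bits(data)             # sent one bit at a time over the line
print(stream)                      # [1, 0, 0, 0, 0, 0, 1, 0]
print(from_bits(stream) == data)   # True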

Parallel communication

parallel, parallel bus, parallel transmission
Bits are transmitted one at a time in serial transmission, and by a multiple number of bits in parallel transmission.
In data transmission, parallel communication is a method of conveying multiple binary digits (bits) simultaneously.

IEEE 1541-2002

IEEE 1541, IEEE 1541-2002 standard
The symbol for the binary digit is either simply bit per recommendation by the IEC 80000-13:2008 standard, or the lowercase character b, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards.
Moreover, there is not a consistent use of the symbols to indicate quantities of bits and bytes – the unit symbol "Mb", for instance, has been widely used for both megabytes and megabits.

Bitwise operation

bitwise, bit shift, bitwise AND
A bitwise operation may process bits one at a time.
In digital computer programming, a bitwise operation operates on one or more bit patterns or binary numerals at the level of their individual bits.
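For concreteness, a few common bitwise operations on small integers (the operand values are arbitrary examples):

a, b = 0b1100, 0b1010

print(bin(a & b))    # 0b1000   AND: bits set in both operands
print(bin(a | b))    # 0b1110   OR:  bits set in either operand
print(bin(a ^ b))    # 0b110    XOR: bits set in exactly one operand
print(bin(a << 1))   # 0b11000  left shift: multiply by 2
print(bin(a >> 2))   # 0b11     right shift: divide by 4
print((a >> 3) & 1)  # 1        extract the single bit at position 3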

Optical disc

optical media, optical data storage, optical discs
When relays were replaced by vacuum tubes, starting in the 1940s, computer builders experimented with a variety of storage methods, such as pressure pulses traveling down a mercury delay line, charges stored on the inside surface of a cathode-ray tube, or opaque spots printed on glass discs by photolithographic techniques.
In computing and optical disc recording technologies, an optical disc (OD) is a flat, usually circular disc which encodes binary data (bits) in the form of pits (binary value of 0 or off, due to lack of reflection when read) and lands (binary value of 1 or on, due to a reflection when read) on a special material (often aluminium) on one of its flat surfaces.

ISO/IEC 80000

IEC 80000-13, ISO 80000-2, ISO 80000-3
The symbol for the binary digit is either simply bit per recommendation by the IEC 80000-13:2008 standard, or the lowercase character b, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards.
The Standard also includes definitions for units relating to information technology, such as the erlang (E), bit (bit), octet (o), byte (B), baud (Bd), shannon (Sh), hartley (Hart) and the natural unit of information (nat).

Teleprinter

teletype, teletypewriter, telex
The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870).
Most teleprinters used the 5-bit International Telegraph Alphabet No. 2 (ITA2).

Magnetic-core memory

core memory, magnetic core memory, ferrite core memory
In the 1950s and 1960s, these methods were largely supplanted by magnetic storage devices such as magnetic core memory, magnetic tapes, drums, and disks, where a bit was represented by the polarity of magnetization of a certain area of a ferromagnetic film, or by a change in polarity from one direction to the other.
Each core stores one bit of information.

Morse code

Morse, International Morse Code, Morse-code
The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870).
Working from the ITU timing definition and further defining a bit as one dot time, a Morse code sequence may be made from a combination of five bit-strings.
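Using the standard ITU timing ratios (a dot is one unit on, a dash three units on, with one unit off between elements, three between letters, and seven between words), such a sequence can be written out as a bit string. The sketch below hard-codes only the two letters it needs and is meant as an illustration, not a full Morse encoder:

# ITU Morse timing expressed as bit-strings (1 = key down, 0 = key up).
DOT, DASH = "1", "111"
ELEMENT_GAP, LETTER_GAP, WORD_GAP = "0", "000", "0000000"

MORSE = {"S": "...", "O": "---"}   # only the letters needed for this example

def letter_to_bits(letter):
    marks = [DOT if mark == "." else DASH for mark in MORSE[letter]]
    return ELEMENT_GAP.join(marks)

def word_to_bits(word):
    return LETTER_GAP.join(letter_to_bits(c) for c in word)

print(word_to_bits("SOS"))
# 10101 000 11101110111 000 10101  (spaces added here only for readability)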

Delay line memory

acoustic delay line, delay lines, delay line
When relays were replaced by vacuum tubes, starting in the 1940s, computer builders experimented with a variety of storage methods, such as pressure pulses traveling down a mercury delay line, charges stored on the inside surface of a cathode-ray tube, or opaque spots printed on glass discs by photolithographic techniques.
Early delay-line memory systems had capacities of a few thousand bits, with recirculation times measured in microseconds.

Dynamic random-access memory

DRAM, video memory, dynamic RAM
In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor.
Dynamic random-access memory (DRAM) is a type of random access semiconductor memory that stores each bit of data in a memory cell consisting of a tiny capacitor and a transistor, both typically based on metal-oxide-semiconductor (MOS) technology.

Punched card

punched cards, punch card, punch cards
The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semyon Korsakov, Charles Babbage, Hermann Hollerith, and early computer manufacturers like IBM.
For some computer applications, binary formats were used, where each hole represented a single binary digit (or "bit"), every column (or row) was treated as a simple bit field, and every combination of holes was permitted.
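To illustrate the bit-field idea (an invented example, not a description of any particular card code), a column of hole positions can be read directly as the bits of an integer:

# One card column with 12 hole positions, read as a 12-bit field.
# True = hole punched = 1, False = no hole = 0 (the assignment is a convention).
column = [True, False, False, True, False, False,
          False, False, True, False, False, False]

value = 0
for position, punched in enumerate(column):
    if punched:
        value |= 1 << position          # set the bit for this hole position

print(value, format(value, "012b"))     # 265 000100001001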

A Mathematical Theory of Communication

The Mathematical Theory of Communication, communication theoretic, Mathematical Theory of Communication
Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication".
It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information.

UTF-8

65001, Unicode (UTF-8), AL32UTF8
The most common multiple-bit unit is the byte, coined by Werner Buchholz in June 1956, which historically represented the group of bits used to encode a single character of text (until UTF-8 multibyte encoding took over) and for this reason was used as the basic addressable element in many computer architectures.
UTF-8 (8-bit Unicode Transformation Format) is a variable-width character encoding capable of encoding all 1,112,064 valid code points in Unicode using one to four 8-bit bytes.
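The variable-width behaviour is easy to see by encoding a few characters (the byte counts shown are standard UTF-8 results):

# UTF-8 uses one to four 8-bit bytes per code point.
for ch in ("A", "é", "€", "😀"):
    encoded = ch.encode("utf-8")
    print(ch, "U+%04X" % ord(ch), len(encoded), "byte(s):", encoded.hex(" "))
# A  U+0041  1 byte(s): 41
# é  U+00E9  2 byte(s): c3 a9
# €  U+20AC  3 byte(s): e2 82 ac
# 😀 U+1F600 4 byte(s): f0 9f 98 80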

Semiconductor memory

memory chip, MOS memory, digital memory
In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor.
In a semiconductor memory chip, each bit of binary data is stored in a tiny circuit called a memory cell consisting of one to several transistors.