Channel capacity

capacity, data capacity, information capacity, Shannon capacity, capacity of channel, channel coding, AWGN channel, capacity of the channel, channel capacities, channel code
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.
142 Related Articles

Information theory

information-theoretic, information theorist, information
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it.
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL).

Entropy (information theory)

entropy, information entropy, Shannon entropy
Following the terms of the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
The entropy provides an absolute limit on the shortest possible average length of a lossless compression encoding of the data produced by a source, and if the entropy of the source is less than the channel capacity of the communication channel, the data generated by the source can be reliably communicated to the receiver (at least in theory, possibly neglecting some practical considerations such as the complexity of the system needed to convey the data and the amount of time it may take for the data to be conveyed).
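As a concrete illustration of that entropy bound (all values here are assumed for the example, not taken from the article), a short Python sketch:

    import math

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical four-symbol source.
    source = [0.5, 0.25, 0.125, 0.125]
    H = entropy(source)                # 1.75 bits/symbol
    channel_capacity = 2.0             # assumed capacity, bits per symbol period

    print(f"H(source) = {H:.2f} bits/symbol")
    # If H < C, the source output can in principle be conveyed reliably.
    print("reliable transmission possible:", H < channel_capacity)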

Noisy-channel coding theorem

Shannon limit, noisy channel, Shannon's theorem
Following the terms of the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
The Shannon theorem states that given a noisy channel with channel capacity C and information transmitted at a rate R, then if R < C there exist codes that allow the probability of error at the receiver to be made arbitrarily small.
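For a concrete instance, the binary symmetric channel with crossover probability p has capacity C = 1 - H_b(p), where H_b is the binary entropy function; a small Python sketch of the R < C check (channel and rate values are assumptions for illustration):

    import math

    def binary_entropy(p):
        # H_b(p) = -p*log2(p) - (1-p)*log2(1-p)
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p = 0.1                      # crossover probability of the BSC (assumed)
    C = 1 - binary_entropy(p)    # capacity, about 0.531 bits per channel use

    R = 0.4                      # proposed code rate (assumed)
    # Noisy-channel coding theorem: if R < C, codes with arbitrarily small
    # error probability exist; if R > C, reliable communication is impossible.
    print(f"C = {C:.3f} bits/use; rate {R} achievable: {R < C}")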

Mutual information

Average Mutual Information, information, algorithmic mutual information
The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
Directed information has many applications in problems where causality plays an important role, such as the capacity of a channel with feedback.
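A sketch of that maximization in Python: the Blahut–Arimoto algorithm (a standard method, though not named in this article) iterates toward the capacity-achieving input distribution for a fixed channel matrix W[x, y] = P(y|x); the example channel is an assumed binary symmetric channel:

    import numpy as np

    def mutual_information_bits(p, W):
        # I(X;Y) in bits for input law p and channel matrix W[x, y] = P(y|x).
        q = p @ W                                # output distribution q(y)
        ratio = np.where(W > 0, W / q, 1.0)      # placeholder 1.0 keeps the log finite
        return (p[:, None] * W * np.log2(ratio)).sum()

    def blahut_arimoto(W, iters=200):
        # Estimate C = max_p I(X;Y) by the Blahut-Arimoto iteration.
        p = np.full(W.shape[0], 1.0 / W.shape[0])   # start from the uniform input
        for _ in range(iters):
            q = p @ W
            ratio = np.where(W > 0, W / q, 1.0)
            D = (W * np.log(ratio)).sum(axis=1)     # KL( W(.|x) || q ) in nats
            p *= np.exp(D)                          # multiplicative update
            p /= p.sum()
        return mutual_information_bits(p, W), p

    # Binary symmetric channel with crossover 0.1: capacity is about 0.531 bits.
    W = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
    C, p_opt = blahut_arimoto(W)
    print(f"C = {C:.3f} bits/use, optimal input = {p_opt}")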

Signal-to-noise ratio

signal to noise ratio, SNR, signal-to-noise
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem: C = B log2(1 + S/N) bits per second.
The signal-to-noise ratio, the bandwidth, and the channel capacity of a communication channel are connected by the Shannon–Hartley theorem.
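A worked number under assumed, illustrative values (a 3 kHz voice-grade line at 30 dB SNR):

    import math

    B = 3000.0                     # bandwidth in Hz (assumed)
    snr_db = 30.0                  # signal-to-noise ratio in dB (assumed)
    snr = 10 ** (snr_db / 10)      # dB to linear power ratio: 1000

    C = B * math.log2(1 + snr)     # Shannon-Hartley capacity in bit/s
    print(f"C = {C:.0f} bit/s")    # about 29,902 bit/s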

Shannon–Hartley theorem

Shannon-Hartley theorem, Hartley's law, Shannon limit
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem: C = B log2(1 + S/N).
The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per time unit that can be transmitted with a specified bandwidth in the presence of noise interference, assuming that the signal power is bounded and that the Gaussian noise process is characterized by a known power or power spectral density.

Communication channel

channel, channels, communications channel
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. Following the terms of the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.

Bandwidth (signal processing)

bandwidth, bandwidths, signal bandwidth
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem: C = B log2(1 + S/N).
In the context of Nyquist symbol rate or Shannon–Hartley channel capacity for communication systems, it refers to passband bandwidth.

Bandwidth (computing)

bandwidth, network bandwidth, Internet bandwidth
The term bandwidth sometimes refers to the net bit rate ('peak bit rate', 'information rate', or physical-layer 'useful bit rate'), channel capacity, or the maximum throughput of a logical or physical communication path in a digital communication system.

Claude Shannon

Claude E. Shannon, Shannon, Claude Elwood Shannon
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it.

Additive white Gaussian noise

AWGN, additive noise, additive
An application of the channel capacity concept to an additive white Gaussian noise (AWGN) channel with B Hz bandwidth and signal-to-noise ratio S/N is the Shannon–Hartley theorem: C = B log2(1 + S/N).
Therefore, the channel capacity for the power-constrained channel is given by C = (1/2) log2(1 + P/N) bits per channel use, where P is the transmit-power constraint and N is the noise variance.
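A minimal sketch of that per-channel-use formula, with the power budget P and noise variance N treated as assumed inputs:

    import math

    def awgn_capacity(P, N):
        # Power-constrained AWGN capacity: C = 0.5 * log2(1 + P/N) bits per channel use.
        return 0.5 * math.log2(1 + P / N)

    for snr in (1, 10, 100):                 # illustrative P/N ratios
        print(f"P/N = {snr:>3}: C = {awgn_capacity(snr, 1):.3f} bits/use")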

Error exponent

In information theory, the error exponent of a channel code or source code is the rate at which the error probability decays exponentially with the block length of the code.
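In symbols (a standard formulation, stated here as background rather than quoted from the article): for block length n and rate R, good codes achieve P_err(n, R) ≈ exp(-n E(R)), i.e. E(R) = lim_{n→∞} -(1/n) ln P_err(n, R); E(R) > 0 for every rate R below capacity, so the error probability vanishes exponentially in n.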

Bit rate

bitrate, data rate, data transfer rate
The channel capacity, also known as the Shannon capacity, is a theoretical upper bound for the maximum net bitrate, exclusive of forward error correction coding, that is possible without bit errors for a certain physical analog node-to-node communication link.

Cooperative diversity

group cooperative relay
Cooperative diversity is a cooperative multiple-antenna technique for improving or maximising total network channel capacities for any given set of bandwidths; it exploits user diversity by decoding the combined signal of the relayed signal and the direct signal in wireless multihop networks.

Redundancy (information theory)

redundancy, redundant, data redundancy
Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.
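As a toy illustration of desired redundancy (the single parity bit here is an assumed minimal example, the simplest checksum):

    def add_parity(bits):
        # Append one redundant bit so the total number of 1s is even.
        return bits + [sum(bits) % 2]

    def check_parity(bits):
        # Detect any single-bit error: the count of 1s must remain even.
        return sum(bits) % 2 == 0

    word = [1, 0, 1, 1]
    sent = add_parity(word)          # [1, 0, 1, 1, 1]
    received = sent.copy()
    received[2] ^= 1                 # channel noise flips one bit
    print(check_parity(sent))        # True: no error detected
    print(check_parity(received))    # False: single-bit error detected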

Throughput

maximum throughput, asymptotic bandwidth, Bandwidth
The maximum throughput is closely related to the channel capacity of the system, and is the maximum possible quantity of data that can be transmitted under ideal circumstances.

MIMO

multiple-input multiple-output, Multiple-input multiple-output communications, multiple-input and multiple-output
For channel capacity in systems with multiple antennas, see the article on MIMO.
Referring to information theory, the ergodic channel capacity of MIMO systems where both the transmitter and the receiver have perfect instantaneous channel state information is C = E[ max_{Q : tr(Q) <= P} log2 det( I + H Q H^H ) ], where the maximization is over input covariance matrices Q meeting the power constraint (solved by water-filling).
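A Monte Carlo sketch of an ergodic MIMO capacity, simplified by assumption to equal power per transmit antenna and i.i.d. Rayleigh fading (so no transmit-side water-filling), with all numeric values illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    n_t, n_r = 2, 2            # transmit / receive antennas (assumed)
    snr = 10.0                 # total SNR, linear scale (10 dB)
    trials = 10_000

    caps = []
    for _ in range(trials):
        # i.i.d. Rayleigh fading: complex Gaussian entries with unit average power
        H = (rng.standard_normal((n_r, n_t))
             + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2)
        # equal-power allocation across the transmit antennas
        M = np.eye(n_r) + (snr / n_t) * (H @ H.conj().T)
        caps.append(np.log2(np.linalg.det(M).real))

    print(f"ergodic capacity = {np.mean(caps):.2f} bit/s/Hz")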

Spectral efficiency

system spectral efficiency, link spectral efficiency, Spectral efficiency comparison table

Electrical engineering

electrical engineer, electrical, Electrical and Electronics Engineering
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

Computer science

computer scientist, computer sciences, computer scientists
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

Upper and lower bounds

upper bound, lower bound, bound
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

Information

informative, input, inputs
Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel.

World War II

Second World War, war, WWII
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it.

Conditional probability distribution

conditional distribution, conditional density, conditional
Furthermore, let p_{Y|X}(y|x) be the conditional probability distribution function of Y given X, which is an inherent fixed property of the communication channel.

Marginal distribution

marginal probability, marginal, marginals
Then the choice of the marginal distribution p_X(x) completely determines the joint distribution due to the identity p_{X,Y}(x, y) = p_{Y|X}(y|x) p_X(x).
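A tiny numeric check of this identity (the channel and input values are assumed for illustration):

    import numpy as np

    # p_{Y|X}: rows indexed by x, columns by y (an assumed binary symmetric channel)
    p_y_given_x = np.array([[0.9, 0.1],
                            [0.1, 0.9]])
    p_x = np.array([0.7, 0.3])           # a chosen marginal distribution on X

    # Identity: p_{X,Y}(x, y) = p_{Y|X}(y|x) * p_X(x)
    p_xy = p_y_given_x * p_x[:, None]

    print(p_xy)                # the fully determined joint distribution
    print(p_xy.sum())          # 1.0: a valid joint distribution
    print(p_xy.sum(axis=0))    # the marginal of Y, obtained for free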