A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.wikipedia
Related Articles

Claude Shannon

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.
Shannon is noted for having founded information theory with a landmark paper, "A Mathematical Theory of Communication", which he published in 1948.

Information theory

The article was the founding work of the field of information theory.
It was originally proposed by Claude Shannon in 1948, in a landmark paper titled "A Mathematical Theory of Communication", to find fundamental limits on signal processing and communication operations such as data compression.

Entropy (information theory)

It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information.
The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
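As a purely illustrative aside (the formula is Shannon's, but the code and the helper name shannon_entropy are my own sketch), the entropy of a source emitting symbols with probabilities p_i is H = -Σ p_i log2 p_i bits per symbol:

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)) in bits; zero-probability symbols contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # a fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))  # a biased coin: about 0.47 bits per toss

The lower the entropy, the more redundant the source, which is what makes lossless compression possible.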

Bell Labs Technical Journal

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.

Warren Weaver

The book contains an additional article by Warren Weaver, providing an overview of the theory for a more general audience.
When Claude Shannon's landmark 1948 articles on communication theory were republished in 1949 as The Mathematical Theory of Communication, the book also republished a much shorter article authored by Weaver, which discusses the implications of Shannon's more technical work for a general audience.

Bit

Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication".

Shannon–Fano coding

It was also in this paper that the Shannon–Fano coding technique was proposed – a technique developed in conjunction with Robert Fano.
The technique was proposed in Shannon's "A Mathematical Theory of Communication", his 1948 article introducing the field of information theory.
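As a rough sketch of the idea (an illustration under my own naming, not pseudocode from the paper), Shannon–Fano coding sorts the symbols by probability and recursively splits the list where the running total comes closest to half of the group's probability, appending a "0" bit to the codes on one side and a "1" bit on the other:

    def shannon_fano(symbols):
        # symbols: list of (symbol, probability) pairs.
        # Sort by probability, then recursively split each group where the
        # cumulative probability is closest to half, appending "0"/"1" bits.
        symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
        codes = {s: "" for s, _ in symbols}

        def split(group):
            if len(group) <= 1:
                return
            total = sum(p for _, p in group)
            running, best_i, best_diff = 0.0, 1, float("inf")
            for i, (_, p) in enumerate(group[:-1], start=1):
                running += p
                diff = abs(2 * running - total)
                if diff < best_diff:
                    best_diff, best_i = diff, i
            for s, _ in group[:best_i]:
                codes[s] += "0"
            for s, _ in group[best_i:]:
                codes[s] += "1"
            split(group[:best_i])
            split(group[best_i:])

        split(symbols)
        return codes

    print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (exact codes depend on tie-breaking)

On this example the resulting code lengths happen to match the optimal ones, but in general Shannon–Fano codes can be slightly longer than Huffman codes.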

Mathematician

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.

Paperback

It was later published in 1949 as a book titled The Mathematical Theory of Communication (ISBN 0-252-72546-8), which was reissued in paperback in 1963 (ISBN 0-252-72548-4).

Redundancy (information theory)

It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information.

John Tukey

It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information.

Robert Fano

It was also in this paper that the Shannon–Fano coding technique was proposed – a technique developed in conjunction with Robert Fano.

List of Internet pioneers

Claude Shannon (1916–2001), called the "father of modern information theory", published "A Mathematical Theory of Communication" in 1948.

History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

Signal processing

In 1948, Claude Shannon wrote the influential paper "A Mathematical Theory of Communication", which was published in the Bell System Technical Journal.

Variety (cybernetics)

Ashby sees his approach as introductory to Shannon's information theory (1948), which deals with the case of "incessant fluctuations" or noise.

List of computer term etymologies

* bit — first used by Claude E. Shannon in his seminal 1948 paper A Mathematical Theory of Communication.

Entropy in thermodynamics and information theory

Shannon commented on the close similarity between thermodynamic entropy and information entropy when publicizing information theory in "A Mathematical Theory of Communication".

Coding theory

In 1948, Claude Shannon published "A Mathematical Theory of Communication", an article in two parts in the July and October issues of the Bell System Technical Journal.

Macy conferences

Much as Shannon had previously shown with his work on relay and switch circuits, McCulloch and Pitts proved that neural networks were capable of carrying out any Boolean algebra calculation.
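As a minimal illustration of that claim (a sketch under my own naming, not material from the conferences), a single McCulloch–Pitts-style threshold unit with fixed weights realizes AND, OR, and NOT, and because those operations are functionally complete, networks of such units can compute any Boolean function:

    def threshold_unit(inputs, weights, threshold):
        # Fires (returns 1) when the weighted sum of binary inputs reaches the threshold.
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

    def AND(a, b):
        return threshold_unit([a, b], [1, 1], 2)

    def OR(a, b):
        return threshold_unit([a, b], [1, 1], 1)

    def NOT(a):
        return threshold_unit([a], [-1], 0)

    # XOR is not computable by any single threshold unit, but a small network of them does it.
    def XOR(a, b):
        return AND(OR(a, b), NOT(AND(a, b)))

    print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]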