On Cryptography, Coding and Information Theory
The word ‘Cryptography’ is Greek in origin and means ‘secret writing’. Essentially, Cryptography is the means by which two parties may communicate without having their message understood by a third party. The three necessary properties of a ‘Cryptogram’ – also called a ‘Cipher’ – are Authentication (the handshake and verification of the message source), Data Integrity (the ability to convey the message with no loss to its meaning or content) and Confidentiality (preventing third-party adversaries from being able to understand it).
Since the Industrial Revolution, and especially since World War I, the means by which Cryptography is done has become increasingly complex. The Enigma machine, used by the German armed forces (the separate Lorenz cipher machine served the German High Command), was among the first machines to employ electromechanical switching technology – rotors and plugboard wiring – as an integral part of its encryption mechanism.
In 1948, Claude Shannon, a scientist working for Bell Laboratories in New Jersey, published a paper called "A Mathematical Theory of Communication" in the Bell System Technical Journal. This paper built on his earlier interest in the Boolean (logical) analysis of electrical (in fact digital) signals, the subject of his master's thesis, A Symbolic Analysis of Relay and Switching Circuits. Drawing also on the probability theory of the mathematician Norbert Wiener, Shannon was able to explore the best means of encoding information sent electronically – not only for the purposes of secrecy and electrical efficiency – but also to address the problem of reconstructing messages after information loss, with no apparent loss of security to the sender. His central tool was a special application of the concept of ‘entropy’: ‘information entropy’, a measure of the average uncertainty in a message.
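Information entropy can be stated concretely. The sketch below (the function name and sample strings are illustrative, not from the source) estimates entropy from observed symbol frequencies: a message drawn from four equally likely symbols carries two bits per symbol, while a perfectly predictable message carries none.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits, from observed frequencies."""
    counts = Counter(message)
    n = len(message)
    # H = sum over symbols of p * log2(1/p); rarer symbols carry more surprise.
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("abcd"))  # four equally likely symbols -> 2.0 bits
print(shannon_entropy("aaaa"))  # a constant message -> 0.0 bits
```

The higher the entropy of a source, the more bits any encoding of it must spend per symbol – the link Shannon drew between compression, transmission and secrecy.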
By applying these tools, Shannon was able to establish mathematically the level of security (breakability) of various types and forms of encrypted messages and their ‘keys’ (the set of instructions for deciphering a given code). In his 1949 paper "Communication Theory of Secrecy Systems", Shannon gave a fuller mathematical treatment of the innate properties of digital cryptography. The results of this work had significant applications in many fields: cryptography, natural language processing, data compression and many others.
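A key, in the sense of a set of instructions for deciphering, can be illustrated with a toy example. The sketch below uses a simple Caesar shift, a classical substitution cipher of the kind Shannon's secrecy analysis covers; the shift of 3 and all names here are illustrative assumptions, not taken from the source. The key is literally a substitution table, and its inverse undoes the encipherment.

```python
import string

LOWER = string.ascii_lowercase
SHIFTED = LOWER[3:] + LOWER[:3]  # illustrative key: shift every letter by 3

# The 'key' is a set of instructions for (de)ciphering:
# a substitution table for enciphering, and its inverse for deciphering.
ENCIPHER_KEY = str.maketrans(LOWER, SHIFTED)
DECIPHER_KEY = str.maketrans(SHIFTED, LOWER)

def encipher(plaintext: str) -> str:
    return plaintext.translate(ENCIPHER_KEY)

def decipher(ciphertext: str) -> str:
    return ciphertext.translate(DECIPHER_KEY)

print(encipher("attack at dawn"))            # dwwdfn dw gdzq
print(decipher(encipher("attack at dawn")))  # attack at dawn
```

Such a cipher is trivially broken by frequency analysis – exactly the sort of weakness Shannon's framework made it possible to quantify.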
Years earlier, in 1940, Alan Turing (often called the father of the logical machine and the modern digital computer) had worked with concepts similar to Shannon's in the breaking of the German Enigma machine cipher (or ‘code’). Turing was able to reconstruct the workings of the German device – its electromechanical language of dials, rotors and switches – from the logic of its transmissions. No then-existing theory or field of study directly provided for this insight, except perhaps the mathematical work of Boltzmann and Gibbs, whose thermodynamics and predictive analyses gave some insight into the problems Turing faced.