Terms and Conditions:
Jonathan I. Hall wrote: These notes are not intended for broad distribution. If you want to use them in any way, please contact me.
Claude Shannon's 1948 paper "A Mathematical Theory of Communication" gave birth to the twin disciplines of information theory and coding theory. The basic goal is efficient and reliable communication in an uncooperative (and possibly hostile) environment, in the form of error-correcting codes.
To be efficient, the transfer of information must not require a prohibitive amount of time and effort. To be reliable, the received data stream must resemble the transmitted stream to within narrow tolerances. These two desires will always be at odds, and the fundamental problem is to reconcile them as best we can.
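As a toy illustration of this tension (my sketch, not an example from the notes), consider a 3-fold repetition code: each bit is transmitted three times, and the decoder takes a majority vote. Reliability improves, since any single error within a block of three is corrected, but efficiency suffers, since the transmission is three times as long.

```python
# Toy illustration of the efficiency/reliability trade-off:
# a 3-fold repetition code with majority-vote decoding.

def encode(bits):
    """Repeat each bit three times (efficiency cost: length triples)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each block of three received bits
    (reliability gain: any single error per block is corrected)."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode(message)           # 12 transmitted bits instead of 4
corrupted = codeword[:]
corrupted[4] ^= 1                    # flip one bit in transit
assert decode(corrupted) == message  # the single error is corrected
```

Better codes, which these notes develop, correct errors at a far smaller cost in rate than repetition does.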
At an early stage the mathematical study of such questions broke into two broad areas. Information theory is the study of achievable bounds for communication and is largely probabilistic and analytic in nature. Coding theory then attempts to realize the promise of these bounds by models which are constructed through mainly algebraic means. Shannon was primarily interested in information theory. Shannon's colleague Richard Hamming had been laboring on error-correction for early computers even before Shannon's 1948 paper, and he made some of the first breakthroughs of coding theory.
Although these notes shall discuss these areas as mathematical subjects, it must always be remembered that the primary motivation for such work comes from its practical engineering applications. Mathematical beauty cannot be the sole gauge of worth. Throughout this manuscript, one should concentrate on the algebra of coding theory, but keep in mind the fundamental bounds of information theory and the practical desires of engineering.
These notes are aimed at advanced undergraduate and beginning graduate students, and are intended to serve as both a course text and a self-study text.