Information Theory, Inference and Learning Algorithms

A textbook on information, communication, and coding for a new generation of students, and an entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
Tag(s): Information Theory
Publication date: 01 Mar 2005
ISBN-10: 0521642981
ISBN-13: n/a
Paperback: 550 pages
Views: 26,751
Document Type: N/A
Publisher: Cambridge University Press
License: n/a
Post time: 11 Aug 2006 11:43:28
Terms and Conditions:
David J. C. MacKay wrote: Now the book is published, these files will remain viewable on this website. The same copyright rules will apply to the online copy of the book as apply to normal books. [e.g., copying the whole book onto paper is not permitted.]

Book excerpts:

Information theory and inference, often taught separately, are here united in one textbook. These topics lie at the heart of many areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography.

This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.
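To give a flavour of the simplest tool in that toolbox, here is a minimal Monte Carlo sketch in Python (an illustrative snippet, not code from the book): an expectation is estimated by averaging a function over independent samples.

```python
import random

def monte_carlo_mean(f, sampler, n=100_000):
    """Estimate E[f(X)] by averaging f over n independent draws of X."""
    return sum(f(sampler()) for _ in range(n)) / n

# Toy example: estimate E[X^2] for X ~ Uniform(0, 1); the exact value is 1/3.
print(monte_carlo_mean(lambda x: x * x, random.random))  # ~0.333
```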

The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes: the twenty-first-century standards for satellite communications, disk drives, and data broadcast.
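As a toy illustration of the parity-check view underlying these codes (a hypothetical (7,4) Hamming-code example using NumPy, not an excerpt from the book): a received word is a valid codeword exactly when its syndrome H·w mod 2 is zero, and low-density parity-check codes apply the same idea with much larger, sparser matrices.

```python
import numpy as np

# Parity-check matrix H of the (7,4) Hamming code, in the form [A | I].
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def syndrome(word):
    """Return (H @ word) mod 2; an all-zero syndrome means every parity check passes."""
    return (H @ np.array(word)) % 2

print(syndrome([1, 0, 1, 1, 0, 1, 0]))  # [0 0 0] -> valid codeword
print(syndrome([1, 0, 1, 1, 0, 1, 1]))  # [0 0 1] -> a single bit error is detected
```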

Intended Audience:

This book is aimed at senior undergraduates and graduate students in Engineering, Science, Mathematics, and Computing. It assumes familiarity with calculus, probability theory, and linear algebra as taught in a first- or second-year undergraduate course on mathematics for scientists and engineers.

Review:

Amazon.com

"If you want to know what's presently going on in the field of coding theory with solid technical foundation, this is the book."

"For a course I help teach, the introductions to probability theory and information theory save a lot of work."

About The Author(s)


David J. C. MacKay

David J. C. MacKay is Regius Professor of Engineering at Cambridge University Engineering Department. He works on Bayesian probability theory and its application to inference problems.

