This item is out of stock.

Information Theory, Inference and Learning Algorithms by David J. C. MacKay: Used

US $54.60
Approximately S$70.15
Condition:
Good
Shipping:
Free Standard Shipping.
Located in: Sparks, Nevada, United States
Delivery:
Estimated between Thu, 11 Sep and Thu, 18 Sep to 94104
Estimated delivery dates include the seller's handling time, origin ZIP Code, destination ZIP Code, and time of acceptance, and will depend on the shipping service selected and receipt of cleared payment. Delivery times may vary, especially during peak periods.
Returns:
30-day returns. Buyer pays for return shipping. If you use an eBay shipping label, its cost will be deducted from your refund.
Coverage:
Read the item description or contact the seller for details.
(Not eligible for eBay purchase protection programmes)
Seller assumes all responsibility for this listing.
eBay item number: 285037302210
Last updated on Sep 06, 2025 12:48:18 SGT

Item specifics

Condition
Good: A book that has been read but is in good condition. Very minimal damage to the cover including ...
Book Title
Information Theory, Inference and Learning Algorithms
Publication Date
2003-10-06
Pages
640
ISBN
9780521642989

About this product

Product Identifiers

Publisher
Cambridge University Press
ISBN-10
0521642981
ISBN-13
9780521642989
eBay Product ID (ePID)
1885345

Product Key Features

Number of Pages
640 Pages
Publication Name
Information Theory, Inference and Learning Algorithms
Language
English
Publication Year
2003
Subject
Information Theory, Algebra / General, Logic, Computer Vision & Pattern Recognition
Type
Textbook
Author
David J. C. MacKay
Subject Area
Mathematics, Philosophy, Computers
Format
Hardcover

Dimensions

Item Height
1.3 in
Item Weight
53.9 oz
Item Length
10 in
Item Width
7.7 in

Additional Product Features

Intended Audience
College Audience
LCCN
2003-055133
Reviews
"This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn." Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London, ‘This is primarily an excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduates and postgraduates students will find it extremely useful for gaining insight into these topics; however, the book also serves as a valuable reference for researchers in these areas. Both sets of readers should find the book enjoyable and highly useful.’David Saad, Aston University, "Most of the theories are accompanied by motivations, and explanations with the corresponding examples...the book achieves its goal of being a good textbook on information theory." SIAM Review, '... a quite remarkable work ... the treatment is specially valuable because the author has made it completely up-to-date ... this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.' Robotica, "...an impressive book, intended as a class text on the subject of the title but having the character and robustness of a focused encyclopedia. The presentation is finely detailed, well documented, and stocked with artistic flourishes." Mathematical Reviews, Advance Praise: 'This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.' Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London, 'An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology, 'With its breadth, accessibility and handsome design, this book should prove to be quite popular. Highly recommended as a primer for students with no background in coding theory, the set of chapters on error correcting codes are an excellent brief introduction to the elements of modern sparse graph codes: LDPC, turbo, repeat-accumulate and fountain codes are described clearly and succinctly.' IEEE Transactions on Information Theory, ‘With its breadth, accessibility and handsome design, this book should prove to be quite popular. 
Highly recommended as a primer for students with no background in coding theory, the set of chapters on error correcting codes are an excellent brief introduction to the elements of modern sparse graph codes: LDPC, turbo, repeat-accumulate and fountain codes are described clearly and succinctly.’IEEE Transactions on Information Theory, ‘… a quite remarkable work … the treatment is specially valuable because the author has made it completely up-to-date … this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.’Robotica, 'This is primarily an excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduates and postgraduates students will find it extremely useful for gaining insight into these topics; however, the book also serves as a valuable reference for researchers in these areas. Both sets of readers should find the book enjoyable and highly useful.' David Saad, Aston University, "An excellent textbook in the areas of infomation theory, Bayesian inference and learning alorithms. Undergraduate and post-graduate students will find it extremely useful for gaining insight into these topics." REDNOVA, "Essential reading for students of electrical engineering and computer science; also a great heads-up for mathematics students concerning the subtlety of many commonsense questions." Choice, '… a quite remarkable work … the treatment is specially valuable because the author has made it completely up-to-date … this magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference. I am very glad to have it on my shelves.' Robotica, "Most of the theories are accompanied by motivations, and explanations with the corresponding examples...the book achieves its goal of being a good textbook on information theory." ACM SIGACT News, 'An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics.' Dave Forney, Massachusetts Institute of Technology, "An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics." Dave Forney, Massachusetts Institute of Technology, 'This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.' Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London, "An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home." Bob McEliece, California Institute of Technology
Illustrated
Yes
Table Of Content
1. Introduction to information theory
2. Probability, entropy and inference
3. More about inference

Part I. Data Compression:
4. The source coding theorem
5. Symbol codes
6. Stream codes
7. Codes for integers

Part II. Noisy-Channel Coding:
8. Dependent random variables
9. Communication over a noisy channel
10. The noisy-channel coding theorem
11. Error-correcting codes and real channels

Part III. Further Topics in Information Theory:
12. Hash codes
13. Binary codes
14. Very good linear codes exist
15. Further exercises on information theory
16. Message passing
17. Constrained noiseless channels
18. Crosswords and codebreaking
19. Why have sex? Information acquisition and evolution

Part IV. Probabilities and Inference:
20. An example inference task: clustering
21. Exact inference by complete enumeration
22. Maximum likelihood and clustering
23. Useful probability distributions
24. Exact marginalization
25. Exact marginalization in trellises
26. Exact marginalization in graphs
27. Laplace's method
28. Model comparison and Occam's razor
29. Monte Carlo methods
30. Efficient Monte Carlo methods
31. Ising models
32. Exact Monte Carlo sampling
33. Variational methods
34. Independent component analysis
35. Random inference topics
36. Decision theory
37. Bayesian inference and sampling theory

Part V. Neural Networks:
38. Introduction to neural networks
39. The single neuron as a classifier
40. Capacity of a single neuron
41. Learning as inference
42. Hopfield networks
43. Boltzmann machines
44. Supervised learning in multilayer networks
45. Gaussian processes
46. Deconvolution

Part VI. Sparse Graph Codes:
47. Low-density parity-check codes
48. Convolutional codes and turbo codes
49. Repeat-accumulate codes
50. Digital fountain codes

Part VII. Appendices:
A. Notation
B. Some physics
C. Some mathematics

Bibliography; Index.
Synopsis
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.

This exciting and entertaining textbook is ideal for courses in information, communication and coding. It is an unparalleled entry point to these subjects for professionals working in areas as diverse as computational biology, data mining, financial engineering and machine learning.

Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
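As a taste of the opening material the synopsis describes, here is a minimal Python sketch, our own illustration rather than anything from the book or this listing (the function name is hypothetical), computing the Shannon entropy of a discrete distribution:

    import math

    def shannon_entropy(probs):
        # H(X) = -sum_i p_i * log2(p_i), measured in bits.
        # Zero-probability outcomes contribute nothing, by convention.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally unpredictable: 1 bit per toss.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A heavily biased coin conveys far less per toss.
    print(shannon_entropy([0.9, 0.1]))   # ~0.469

This is the quantity developed in Chapter 2 (Probability, entropy and inference) and underlying the source coding theorem of Chapter 4.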
LC Classification Number
Q360 .M23 2003

Item description from the seller

About this seller

AlibrisBooks

98.6% positive feedback
2.0M items sold

Joined May 2008
Usually responds within 24 hours
Alibris is the premier online marketplace for independent sellers of new & used books, as well as rare & collectible titles. We connect people who love books to thousands of independent sellers around ...

Detailed Seller Ratings

Average for the last 12 months
Accurate description
4.9
Reasonable shipping cost
5.0
Shipping speed
5.0
Communication
5.0

Seller feedback (519,244)
