Information Theory, Inference, and Learning Algorithms


  • Author: David J. C. MacKay
  • Format: PDF, PostScript, DjVu, LaTeX
  • Price: free

Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering – communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography.

This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.
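
As a taste of that toolbox, here is a minimal, self-contained Python sketch of one of the Monte Carlo methods the book develops: a Metropolis sampler drawing from an unnormalized density. The target distribution (a two-component Gaussian mixture) and all names here are illustrative choices, not code from the book.

```python
import math
import random

def target(x):
    """Unnormalized density: mixture of Gaussians centred at -2 and +2."""
    return math.exp(-0.5 * (x + 2) ** 2) + math.exp(-0.5 * (x - 2) ** 2)

def metropolis(n_samples, step=1.0, x0=0.0, seed=0):
    """Metropolis sampling with a symmetric Gaussian random-walk proposal."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x));
        # the normalizing constant of the target cancels in this ratio.
        if rng.random() < min(1.0, target(proposal) / target(x)):
            x = proposal
        samples.append(x)
    return samples

if __name__ == "__main__":
    draws = metropolis(50_000)
    mean = sum(draws) / len(draws)
    print(f"sample mean ~ {mean:.3f} (target mean is 0 by symmetry)")
```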

The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes — the twenty-first century standards for satellite communications, disk drives, and data broadcast.
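
Before reaching those state-of-the-art codes, the book opens with the classic (7,4) Hamming code as its first example of error correction. The sketch below, with function names of my own choosing rather than code from the text, encodes 4 data bits into 7, passes the block through a binary symmetric channel, and corrects any single flipped bit via the syndrome.

```python
import random

def hamming74_encode(d3, d5, d6, d7):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7,
    parity bits at positions 1, 2, and 4)."""
    p1 = d3 ^ d5 ^ d7            # covers positions 1, 3, 5, 7
    p2 = d3 ^ d6 ^ d7            # covers positions 2, 3, 6, 7
    p4 = d5 ^ d6 ^ d7            # covers positions 4, 5, 6, 7
    return [p1, p2, d3, p4, d5, d6, d7]

def hamming74_decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4  # 1-based error position, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

def bsc(bits, f, rng):
    """Binary symmetric channel: flip each bit with probability f."""
    return [b ^ (rng.random() < f) for b in bits]

if __name__ == "__main__":
    rng = random.Random(1)
    trials, errors = 10_000, 0
    for _ in range(trials):
        data = [rng.randint(0, 1) for _ in range(4)]
        received = bsc(hamming74_encode(*data), f=0.05, rng=rng)
        if hamming74_decode(received) != data:
            errors += 1
    print(f"block error rate at f=0.05: {errors / trials:.4f}")
```

The decoder fails only when two or more bits in a block are flipped; the sparse-graph codes of the book's final part push far closer to the channel capacity that Shannon's theorem promises.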

Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay’s groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way.

In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.

Chapters include:

  • Introduction to Information Theory
  • Probability, Entropy, and Inference
  • More about Inference
  • Data Compression
  • The Source Coding Theorem
  • Symbol Codes
  • Stream Codes
  • Codes for Integers
  • Noisy-Channel Coding
  • Dependent Random Variables
  • Communication over a Noisy Channel
  • The Noisy-Channel Coding Theorem
  • Error-Correcting Codes and Real Channels
  • Further Topics in Information Theory
  • Hash Codes: Codes for Efficient Information Retrieval
  • Binary Codes
  • Very Good Linear Codes Exist
  • Further Exercises on Information Theory
  • Message Passing
  • Communication over Constrained Noiseless Channels
  • Crosswords and Codebreaking
  • Why have Sex? Information Acquisition and Evolution
  • Probabilities and Inference
  • An Example Inference Task: Clustering
  • Exact Inference by Complete Enumeration
  • Maximum Likelihood and Clustering
  • Useful Probability Distributions
  • Exact Marginalization
  • Exact Marginalization in Trellises
  • Exact Marginalization in Graphs
  • Laplace’s Method
  • Model Comparison and Occam’s Razor
  • Monte Carlo Methods
  • Efficient Monte Carlo Methods
  • Ising Models
  • Exact Monte Carlo Sampling
  • Variational Methods
  • Independent Component Analysis and Latent Variable Modelling
  • Random Inference Topics
  • Decision Theory
  • Bayesian Inference and Sampling Theory
  • Neural Networks
  • Introduction to Neural Networks
  • The Single Neuron as a Classifier
  • Capacity of a Single Neuron
  • Learning as Inference
  • Hopfield Networks
  • Boltzmann Machines
  • Supervised Learning in Multilayer Networks
  • Gaussian Processes
  • Deconvolution
  • Sparse Graph Codes
  • Low-Density Parity-Check Codes
  • Convolutional Codes and Turbo Codes
  • Repeat-Accumulate Codes
  • Digital Fountain Codes

http://www.inference.phy.cam.ac.uk/mackay/itprnn/book.html
