What this book covers
Chapter 1, Probabilistic Reasoning, covers topics ranging from the basic concepts of probability and the Bayes formula to PGMs as a generic framework for tractable, efficient, and easy probabilistic modeling.
Chapter 2, Exact Inference, shows you how to build PGMs by combining simple graphs and how to perform queries on the model using an exact inference algorithm called the junction tree algorithm.
Chapter 3, Learning Parameters, covers fitting and learning PGMs from data sets using the Maximum Likelihood approach.
Chapter 4, Bayesian Modeling – Basic Models, covers simple and powerful Bayesian models that can be used as building blocks for more advanced models, and shows you how to fit and query them with suitable algorithms.
Chapter 5, Approximate Inference, covers the second way to perform inference in PGMs, based on sampling, and presents the main sampling algorithms, such as MCMC.
Chapter 6, Bayesian Modeling – Linear Models, presents a Bayesian view of the standard linear regression algorithm and a solution to the problem of overfitting.
Chapter 7, Probabilistic Mixture Models, covers more advanced probabilistic models in which the data comes from a mixture of several simple models.
Appendix, References, lists all the books and articles that were used to write this book.