Chapter 1, Thinking Probabilistically, covers the basic concepts of Bayesian statistics and its implications for data analysis. This chapter contains most of the foundational ideas used in the rest of the book.
Chapter 2, Programming Probabilistically, revisits the concepts from the previous chapter from a more computational perspective. The PyMC3 probabilistic programming library is introduced, as well as ArviZ, a Python library for exploratory analysis of Bayesian models. Hierarchical models are explained with a couple of examples.
Chapter 3, Modeling with Linear Regression, covers the basic elements of linear regression, a very widely used model and the building block of more complex models.
Chapter 4, Generalizing Linear Models, covers how to extend linear models to distributions other than the Gaussian, opening the door to solving many more data analysis problems.
Chapter 5, Model Comparison, discusses how to compare, select, and average models using WAIC, LOO, and Bayes factors. The general caveats of these methods are discussed.
Chapter 6, Mixture Models, discusses how to add flexibility to models by mixing simpler distributions to build more complex ones. The first non-parametric model in the book is also introduced: the Dirichlet process.
Chapter 7, Gaussian Processes, covers the basic idea behind Gaussian processes and how to use them to build non-parametric models over functions for a wide array of problems.
Chapter 8, Inference Engines, provides an introduction to methods for numerically approximating the posterior distribution, as well as a very important topic from the practitioner's perspective: how to diagnose the reliability of the approximated posterior.
Chapter 9, Where To Go Next?, provides a list of resources for you to continue learning beyond this book, and a very short farewell speech.