In this chapter, we are going to introduce a very important algorithmic framework for many statistical learning tasks: the Expectation-Maximization (EM) algorithm. Contrary to what its name might suggest, this is not a method for solving a single problem, but a general methodology that can be applied in several different contexts. Our goal is to explain the rationale behind it and show the mathematical derivation, together with some practical examples. In particular, we are going to discuss the following topics:
- Maximum Likelihood Estimation (MLE) and Maximum A Posteriori (MAP) learning approaches
- The EM algorithm with a simple application for the estimation of unknown parameters
- The Gaussian mixture algorithm, which is one of the most famous applications of EM
- Factor analysis
- Principal Component Analysis (PCA)
- Independent Component Analysis (ICA)
- A brief explanation of the forward-backward algorithm for Hidden Markov Models (HMMs)
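
As a quick preview of one of the applications listed above, the following is a minimal sketch of fitting a Gaussian mixture, whose parameters scikit-learn's `GaussianMixture` estimates internally with the EM algorithm. The two-blob synthetic dataset and all numerical values are illustrative assumptions; the full derivation and a step-by-step example are developed later in the chapter.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic toy dataset (an assumption for illustration):
# two overlapping 2D Gaussian blobs
rng = np.random.RandomState(1000)
X = np.vstack([
    rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(200, 2)),
    rng.normal(loc=[2.0, 2.0], scale=1.5, size=(200, 2)),
])

# GaussianMixture estimates means, covariances, and mixing weights via EM
gm = GaussianMixture(n_components=2, max_iter=100, random_state=1000)
gm.fit(X)

print(gm.means_)        # estimated component means
print(gm.weights_)      # estimated mixing coefficients
labels = gm.predict(X)  # hard cluster assignments (argmax of responsibilities)
```

The estimated means and weights should approximately recover the parameters used to generate the blobs; the chapter explains why the underlying iterative procedure is guaranteed not to decrease the likelihood at each step.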