Summary
In this chapter, we analyzed three different approaches to component extraction. FA assumes a small number of Gaussian latent variables together with a Gaussian, decorrelated noise term. The only restriction on the noise is that its covariance matrix is diagonal, so two different scenarios are possible. When the noise is heteroscedastic, the process is a genuine FA. When, instead, the noise is homoscedastic, the algorithm becomes equivalent to a PCA. In this case, the procedure amounts to searching the sample space for the directions along which the variance is highest. By selecting only the most important directions, we can project the original dataset onto a low-dimensional subspace where the covariance matrix becomes decorrelated.
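The PCA procedure described above can be sketched with NumPy alone: diagonalize the sample covariance matrix, keep the directions with the highest variance, and project. The dataset, dimensions, and component count here are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 500 samples in 3 correlated dimensions (illustrative assumption)
X = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 3)) \
    + 0.1 * rng.normal(size=(500, 3))

# Center the data and compute the sample covariance matrix
Xc = X - X.mean(axis=0)
C = (Xc.T @ Xc) / (len(Xc) - 1)

# Eigendecomposition: eigenvectors are the directions of maximum variance
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]   # sort by decreasing variance
W = eigvecs[:, order[:2]]           # keep the two most important directions

# Project the dataset onto the low-dimensional subspace
Z = Xc @ W

# The covariance of the projected data is diagonal: the components
# are decorrelated, as stated in the summary
Cz = (Z.T @ Z) / (len(Z) - 1)
print(np.allclose(Cz, np.diag(np.diag(Cz)), atol=1e-8))  # → True
```

Since the eigenvectors of a covariance matrix are orthogonal, the projected covariance `W.T @ C @ W` is exactly the diagonal matrix of the selected eigenvalues, which is why the decorrelation check succeeds.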
One of the problems shared by FA and PCA is the assumption that the latent variables can be modeled with Gaussian distributions. This choice simplifies the model but, at the same time, yields dense representations where...