Comparing principal component analysis with the Restricted Boltzmann machine
In this section, you will learn about two widely used dimensionality reduction techniques: principal component analysis (PCA) and the restricted Boltzmann machine (RBM). Consider a vector v in n-dimensional space. A dimensionality reduction technique transforms v into a smaller (or, at most, equal-sized) vector v' with m dimensions (m ≤ n). The transformation can be either linear or nonlinear.
PCA performs a linear transformation of the features, generating mutually orthogonal components that are then ordered by the amount of variance they capture. These m components can be treated as new input features, and the original vector can be expressed in terms of them as follows:
Vector v' = w1*c1 + w2*c2 + ... + wm*cm
Here, wi and ci correspond to the weights (loadings) and the transformed components, respectively.
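The linear transformation above can be sketched with NumPy via the singular value decomposition of the centered data; the data matrix, the choice of m, and the variable names here are illustrative, not part of the text:

```python
import numpy as np

# Toy data: 6 samples with n = 3 features (values are illustrative).
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.6],
              [3.1, 3.0, 0.2],
              [2.3, 2.7, 0.5]])

# Center the features; the right singular vectors of the centered
# matrix are the orthogonal loading vectors w_i, ordered by the
# variance each one captures.
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

m = 2                # keep the top m components (m < n)
W = Vt[:m].T         # loadings w_i as columns, shape (n, m)
C = X_centered @ W   # transformed components c_i per sample, shape (6, m)

# Each centered sample is approximated as sum_i c_i * w_i,
# matching the expression for v' above.
X_approx = C @ W.T
```

Because the loading vectors are orthonormal, projecting and reconstructing with `W` gives the best rank-m linear approximation of the centered data in the least-squares sense.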
Unlike PCA, RBMs (and related models such as DBNs and autoencoders) perform non-linear transformations using the weighted connections between their visible and hidden units.
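As a minimal sketch of this non-linear reduction, scikit-learn's `BernoulliRBM` maps binary visible units to a smaller set of hidden units through a sigmoid; the data, the unit counts, and the hyperparameters below are illustrative assumptions, not values from the text:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

# Binary toy data: 100 samples with n = 16 visible units.
rng = np.random.RandomState(0)
X = (rng.rand(100, 16) > 0.5).astype(float)

# m = 4 hidden units: the hidden activations are a non-linear
# (sigmoid of a weighted sum) reduction of the 16-dimensional input.
rbm = BernoulliRBM(n_components=4, learning_rate=0.05,
                   n_iter=20, random_state=0)
H = rbm.fit_transform(X)  # hidden unit probabilities, shape (100, 4)
```

Each row of `H` lies in [0, 1] because it is a vector of hidden-unit activation probabilities, in contrast to the unbounded linear projections produced by PCA.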