Dimensionality reduction, or feature projection, consists of converting data from a high-dimensional space into a space of fewer dimensions.
High dimensionality substantially increases computational complexity and can also increase the risk of overfitting.
Dimensionality reduction techniques are also useful for feature selection. In this case, the original variables are combined into new variables; these combinations extract and summarize the relevant information from a complex dataset using fewer variables.
Different algorithms exist, the most important of which are listed below; a short example follows the list:
- Principal Component Analysis (PCA)
- Sammon mapping
- Singular Value Decomposition (SVD)
- Isomap
- Locally Linear Embedding (LLE)
- Laplacian eigenmaps
- t-distributed Stochastic Neighbor Embedding (t-SNE)
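As a concrete illustration of the first technique in the list, the minimal sketch below applies PCA to reduce a 64-dimensional dataset to 10 new variables, each a linear combination of the original features. It assumes scikit-learn and NumPy are available; the bundled digits dataset and the choice of 10 components are illustrative assumptions, not prescriptions.

```python
# A minimal PCA sketch, assuming scikit-learn and NumPy are installed.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# Load a small example dataset: 1797 samples with 64 features (8x8 pixel images).
X, y = load_digits(return_X_y=True)

# Project the data onto 10 principal components (linear combinations
# of the original 64 variables). The component count is an assumption.
pca = PCA(n_components=10)
X_reduced = pca.fit_transform(X)   # shape: (1797, 10)

# Inspect how much of the original variance the new variables summarize.
print("Explained variance per component:",
      np.round(pca.explained_variance_ratio_, 3))
print("Total variance retained:", pca.explained_variance_ratio_.sum())
```

The same `fit_transform` pattern applies to the nonlinear methods in the list (for example, `sklearn.manifold.TSNE` for t-SNE), with the difference that the new variables are no longer simple linear combinations of the original ones.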
Although dimensionality reduction is not very common...