Dimensionality reduction
Sometimes, you will have to deal with datasets containing a large number of features, many of which may be unnecessary. This is typical when you log as much data as you can, either to capture enough information to properly predict the target variable or simply to have more data available in the future. Some features are highly informative for the prediction, some are correlated with each other, and some are completely unrelated (that is, they contain only noise or irrelevant information).
Hence, dimensionality reduction is the operation of eliminating some features of the input dataset and creating a restricted set of features that still contains all the information you need to predict the target variable more effectively. Reducing the number of features usually also reduces the variability and complexity of the output model (as well as the training and prediction time).
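As a minimal sketch of this idea, the following example uses Principal Component Analysis (PCA) from scikit-learn on a synthetic dataset: 20 features are generated, but only 5 carry independent signal (the rest are redundant combinations or noise), so projecting onto 5 components retains most of the useful information. The dataset sizes and component count here are illustrative choices, not prescriptions.

```python
# A minimal dimensionality reduction sketch using PCA (scikit-learn),
# on a synthetic dataset where only a few directions carry signal.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

# 100 samples, 20 features: 5 informative, 10 redundant (linear
# combinations of the informative ones), and the rest noise.
X, y = make_classification(n_samples=100, n_features=20,
                           n_informative=5, n_redundant=10,
                           random_state=0)

# Project the data onto the 5 directions of highest variance.
pca = PCA(n_components=5)
X_reduced = pca.fit_transform(X)

print(X.shape)          # (100, 20)
print(X_reduced.shape)  # (100, 5)
# Fraction of the total variance retained by the 5 components:
print(pca.explained_variance_ratio_.sum())
```

After the transformation, a model can be trained on `X_reduced` instead of `X`, trading a small loss of variance for a simpler, faster fit.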
The main hypothesis behind many of the algorithms used for the reduction is the one pertaining to Additive White Gaussian Noise (AWGN). It is an independent...