In this real-world case of predicting bank failures, we have a large number of variables, or financial ratios, available to train a classifier, so we might expect that using all of them would produce a strong predictive model. With this in mind, why would we want to select a subset of variables and reduce their number?
Well, in some cases, increasing the dimensionality of the problem by adding new features can actually reduce the performance of our model. This is known as the curse of dimensionality.
Under this curse, adding more features, that is, increasing the dimensionality of the feature space, requires collecting more data. Specifically, the number of observations we need grows exponentially with the number of features if the learning process is to keep the feature space adequately populated and avoid overfitting.
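To see why data requirements explode with dimension, consider a classic geometric illustration (not taken from this case study, just a standard sketch): to capture a fixed fraction of uniformly distributed points inside a unit hypercube, the edge length of the sub-cube you must search approaches 1 as the number of dimensions grows, meaning local neighborhoods stop being local. The function name below is our own choice for the example.

```python
def edge_length_for_fraction(fraction, dims):
    """Edge length of a sub-cube covering `fraction` of the volume
    of a unit hypercube in `dims` dimensions: fraction ** (1/dims)."""
    return fraction ** (1.0 / dims)

# Edge length needed to enclose 10% of the data, by dimensionality:
for d in (1, 2, 10, 100):
    print(f"{d:>3} dims -> edge length {edge_length_for_fraction(0.1, d):.3f}")
# In 1 dimension an interval of length 0.1 suffices; in 100 dimensions
# the "neighborhood" must span almost the entire range of every feature.
```

With roughly a hundred financial ratios, a neighborhood that contains even 10% of the banks in the sample would have to cover nearly the full observed range of each ratio, which is why the sample size must grow so quickly as features are added.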
This problem is commonly observed in cases in which the ratio between the number of variables...