A feature selection algorithm reduces the number of features used for training by keeping only those that appear genuinely useful for predicting the target value. This is expected to improve both the accuracy and the efficiency of the computation, since it mitigates the curse of dimensionality.
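As a minimal sketch of such a filter-style selection, the snippet below ranks features by the absolute Pearson correlation between each column and the target and keeps the top k. The function name `select_top_k` and the synthetic data are illustrative assumptions, not part of the original text; real pipelines would typically use a library routine instead.

```python
import numpy as np

def select_top_k(X, y, k):
    """Score each feature by |Pearson correlation| with the target
    and keep the k highest-scoring columns (a simple filter method)."""
    scores = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    )
    kept = np.sort(np.argsort(scores)[::-1][:k])
    return X[:, kept], kept

# Synthetic example (assumed for illustration): two informative
# features plus three pure-noise features.
rng = np.random.default_rng(0)
n = 200
informative = rng.normal(size=(n, 2))
noise = rng.normal(size=(n, 3))
y = informative @ np.array([2.0, -1.5]) + 0.1 * rng.normal(size=n)
X = np.hstack([informative, noise])

X_sel, kept = select_top_k(X, y, k=2)
print(kept)  # the informative columns should score highest
```

With enough samples, the correlation scores of the noise columns stay near zero, so only the informative columns survive the cut.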
Why dimensionality reduction?
Curse of dimensionality
The curse of dimensionality is a common problem in which the number of data points required grows exponentially as the number of dimensions increases. Let's say we have two datasets: one in a one-dimensional space and one in a three-dimensional space. If we want to achieve sufficient accuracy with 10...
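The exponential growth can be sketched with a quick back-of-the-envelope calculation: if covering one axis at a given resolution takes 10 sample points, covering a d-dimensional grid at the same resolution takes 10 to the power d. The constant 10 here is an illustrative assumption, not a figure from the text.

```python
# Points needed to cover a unit cube with 10 samples per axis:
points_per_axis = 10
for d in (1, 2, 3):
    # A regular grid in d dimensions needs points_per_axis ** d samples.
    print(f"{d}D: {points_per_axis ** d} points")
```

So moving from one dimension to three multiplies the required data a hundredfold, which is why feature selection pays off quickly.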