Feature Selection
Two common feature selection techniques are forward selection and backward selection.
- Forward Selection: This approach requires a labeled dataset. We start with a single feature, build the model, then add features one at a time, recording the model's accuracy at each step. We then select the combination of features that produced the highest accuracy during training. One con of this technique is that for a dataset with a large number of features it is extremely time-consuming. Another is that once a feature is added it is never removed, so if an already-added feature later degrades the model's performance, we will not detect it.
- Backward Selection: This approach also requires a labeled dataset. We first build the model using all of the features, then iteratively remove features while observing the model's performance, and finally select the combination that produced the highest performance. The con of this approach is similar: with a large number of features, repeatedly retraining the model after each removal is time-consuming, and the very first model must be trained on the full feature set, which can itself be expensive.
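The two procedures above can be sketched as greedy loops over feature subsets. The following is a minimal illustration, not a production implementation: `score_fn` stands in for "train the model on this feature subset and return its validation accuracy," and the toy scorer at the bottom is a hypothetical stand-in used only to make the example runnable.

```python
def forward_select(features, score_fn):
    """Greedy forward selection: start with no features and repeatedly add
    the feature that most improves the score, stopping when no addition helps."""
    selected = []
    remaining = list(features)
    best_score = float("-inf")
    while remaining:
        # Score every candidate subset formed by adding one more feature.
        score, best_f = max((score_fn(selected + [f]), f) for f in remaining)
        if score <= best_score:
            break  # no candidate improves the model; stop
        best_score = score
        selected.append(best_f)
        remaining.remove(best_f)
    return selected, best_score


def backward_select(features, score_fn):
    """Greedy backward selection: start with all features and repeatedly drop
    the feature whose removal most improves the score."""
    selected = list(features)
    best_score = score_fn(selected)  # first model uses the full feature set
    while len(selected) > 1:
        # Score every candidate subset formed by removing one feature.
        score, worst = max(
            (score_fn([f for f in selected if f != g]), g) for g in selected
        )
        if score <= best_score:
            break  # every removal hurts; stop
        best_score = score
        selected.remove(worst)
    return selected, best_score


# Hypothetical scorer: rewards the "relevant" features a and b,
# with a small penalty per feature to mimic an overfitting cost.
def toy_score(subset):
    relevant = {"a", "b"}
    return len(set(subset) & relevant) - 0.1 * len(subset)


print(forward_select(["a", "b", "c", "d"], toy_score))
print(backward_select(["a", "b", "c", "d"], toy_score))
```

Both directions converge on the subset `{a, b}` under this toy scorer, but in general they can disagree: each is greedy, so neither is guaranteed to find the globally best feature combination.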