Summary
In this chapter, we explored various feature selection and reduction techniques. The three main topics covered were Feature Engineering, Feature Selection, and Feature Reduction. The latter two share the same purpose of shrinking the number of features, but the techniques they use are completely different. Feature Engineering focuses on transforming variables into a new form that either improves model performance or brings the variable into compliance with a model assumption. For example, the linearity assumption in a linear regression model can often be addressed by squaring or cubing a variable, while skewness in a data distribution can be addressed with a log transformation. Feature Selection and Feature Reduction provide the best feature set, or the best representation of the feature set, which improves model performance. Most importantly, both techniques shrink the feature space, which drastically reduces model training time without a significant loss in model performance.
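As a quick illustration of the feature engineering transforms recalled above, here is a minimal sketch using pandas and NumPy. The column names (`income`, `age`) and the synthetic data are hypothetical, chosen only to show a log transform for a skewed variable and squared/cubed terms for a non-linear one:

```python
import numpy as np
import pandas as pd

# Hypothetical dataset: a right-skewed 'income' column and an 'age' column
# assumed to have a non-linear relationship with the target.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "income": rng.lognormal(mean=10, sigma=1, size=1000),
    "age": rng.integers(18, 70, size=1000),
})

# Log transform to reduce skewness (log1p handles zero values safely).
df["income_log"] = np.log1p(df["income"])

# Squared and cubed terms to help satisfy the linearity assumption
# of a linear regression model.
df["age_sq"] = df["age"] ** 2
df["age_cube"] = df["age"] ** 3

print(df[["income_log", "age_sq", "age_cube"]].head())
```

In practice, the transformed columns would be fed to the model alongside (or instead of) the originals, with the choice guided by the model's assumptions and the observed distribution of each variable.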