In the winning solution of the 2009 KDD Cup, the authors created new features by combining two or more variables with decision trees, and then used those new features to train the winning predictive model. This technique is particularly useful for deriving features that are monotonic with the target, which is convenient for linear models. The procedure consists of building a decision tree using a subset of the features, typically two or three at a time, and then using the prediction of the tree as a new feature.
Deriving new features with decision trees not only produces features that are monotonically related to the target, but also captures feature interactions, which is useful when building models that cannot learn interactions on their own, such as linear models.
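To make the procedure concrete, here is a minimal sketch using scikit-learn and a toy dataset; the dataset and the pair of features chosen are illustrative stand-ins, not part of the original KDD Cup solution:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

# Load a toy binary-classification dataset to stand in for real data.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Pick a small subset of features, two at a time, to combine with a tree.
feature_pair = ["mean radius", "mean texture"]

# Fit a shallow decision tree on just these two variables.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X[feature_pair], y)

# Use the tree's predicted probability as the new, combined feature.
# By construction it is monotonic with the target, and it encodes the
# interaction between the two original variables.
X["tree_mean_radius_mean_texture"] = tree.predict_proba(X[feature_pair])[:, 1]

print(X["tree_mean_radius_mean_texture"].head())
```

Note that, in practice, you would typically derive the new feature with cross-validated predictions (for example, with scikit-learn's `cross_val_predict`) rather than by predicting on the same rows the tree was fitted on, to avoid leaking the target into the new feature.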
In this recipe, we will learn how to create new features with decision trees...