Estimating with decision tree regression
Decision tree regression is also called a regression tree. A regression tree is easy to understand when you compare it with its sibling, the classification tree, which you are already familiar with.
Transitioning from classification trees to regression trees
In classification, a decision tree is constructed by recursive binary splitting, growing each node into left and right children. In each partition, it greedily searches for the most significant combination of a feature and its value as the optimal splitting point. The quality of a split is measured by the weighted purity of the labels of the two resulting children, specifically via Gini Impurity or Information Gain. In regression, the tree construction process is almost identical to the classification one, with only two differences because the target becomes continuous:
- The quality of a splitting point is now measured by the weighted MSE of the two children; the MSE of a node is equivalent to the variance of the target values it contains, and the smaller the weighted MSE, the better the split (see the sketch after this list).
- The average value of the targets in a terminal node becomes the leaf's prediction, instead of the majority vote over labels used in a classification tree.
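To make the splitting criterion concrete, here is a minimal sketch of how the weighted MSE of a candidate split can be computed. The helper names `mse` and `weighted_mse` are illustrative, not part of any library; the only assumption is that NumPy is available.

```python
import numpy as np

def mse(targets):
    # The MSE of a node equals the variance of its target values
    if len(targets) == 0:
        return 0.0
    return np.var(targets)

def weighted_mse(groups):
    # Weighted average of the children's MSEs, weighted by node size
    total = sum(len(group) for group in groups)
    return sum(len(group) / total * mse(group) for group in groups)

# Two candidate splits of the same six target values:
# a clean split groups similar targets together and yields a low weighted MSE
left, right = np.array([1.0, 2.0, 3.0]), np.array([10.0, 11.0, 12.0])
print(f'Good split, weighted MSE = {weighted_mse([left, right]):.2f}')

# a poor split mixes small and large targets and yields a higher weighted MSE
left, right = np.array([1.0, 10.0, 3.0]), np.array([2.0, 11.0, 12.0])
print(f'Poor split, weighted MSE = {weighted_mse([left, right]):.2f}')
```

During tree construction, every candidate feature-value pair is scored this way, and the pair with the lowest weighted MSE is chosen as the splitting point for the current node.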