Exploring regression trees
A regression tree is very similar to a classification tree: it takes numerical features as input and predicts another numerical variable.
Note
It is perfectly fine to have mixed-type features – for example, some of them discrete and some of them continuous. We won't cover such examples due to space limitations, but they are straightforward extensions.
There are two important visible differences:
- The output is not a discrete label but a numerical value.
- The splitting rules are no longer yes-or-no questions about labels; they are usually inequalities on the values of certain features.
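To make the second difference concrete, here is a minimal sketch of how a fitted one-split regression tree turns an inequality into a numeric prediction. The function name, threshold, and leaf values are illustrative assumptions, not taken from this book's dataset:

```python
# Hypothetical one-split regression tree. The splitting rule is an
# inequality on the feature; each leaf predicts the mean of the
# training targets that fell into it. All numbers are made up.
def predict_one_split(x, threshold=85.0, left_value=10.0, right_value=3.0):
    """Return the leaf value for the leaf that x falls into."""
    if x < threshold:         # splitting rule: an inequality, not a label test
        return left_value     # mean target of training points with x < threshold
    return right_value        # mean target of training points with x >= threshold
```

A deeper tree simply chains such inequalities, so the prediction function is piecewise constant in the feature.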
In this section, we will just use a one-feature dataset to build a regression tree – a dataset that a linear regression model won't be able to fit well. I created an artificial dataset with the following code snippet:
def price_2_revenue(price):
    if price < 85:
        ...
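As a hedged, self-contained sketch of what tree fitting does with such a one-feature dataset, the snippet below scans candidate thresholds and keeps the inequality that minimizes the summed squared error of predicting each side's mean. The toy price/revenue values are invented for illustration and are not the author's dataset:

```python
# Sketch of split selection for a regression tree on one feature:
# for each candidate threshold t, predict the mean target on each
# side of "x < t" and keep the t with the lowest total squared error.
def best_split(xs, ys):
    def sse(vals):
        # sum of squared errors around the mean of vals
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_score, best_t = float("inf"), None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        score = sse(left) + sse(right)
        if score < best_score:
            best_score, best_t = score, t
    return best_t

# Invented data: revenue drops sharply once price passes a threshold.
prices = [70, 75, 80, 90, 95, 100]
revenues = [10, 11, 10, 3, 4, 3]
print(best_split(prices, revenues))  # → 90
```

The chosen threshold separates the two revenue regimes, which is exactly the kind of inequality rule the tree will place at its root.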