Tree-Based Regression Models
Linear models are not the only type of regression model. Another powerful technique is the regression tree, which is based on the idea of a decision tree. A decision tree is a bit like a flowchart: at each step you ask whether a variable is above or below some threshold, and after flowing through several of these steps you reach the end of the tree and receive the predicted value. The following figure illustrates how a regression tree works:
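To make the flowchart analogy concrete, the following is a minimal sketch (assuming scikit-learn and NumPy are available; the toy data and variable names are illustrative) that fits a shallow regression tree and prints its structure as nested threshold questions:

```python
# Minimal sketch: a fitted regression tree reads like a flowchart of
# "is x above or below this threshold?" questions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Toy data: one predictor and a noisy nonlinear outcome (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=100)

# Keep the tree shallow so the printed structure stays readable.
tree = DecisionTreeRegressor(max_depth=2).fit(X, y)

# export_text prints the tree as nested threshold questions,
# each path ending in a predicted value at a leaf.
print(export_text(tree, feature_names=["x"]))
```

Reading the printed output from top to bottom traces exactly the path a new observation would take through the tree before arriving at its prediction.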
Decision trees are interesting because they can pick up on trends in data that linear regression might miss or capture poorly. Whereas linear models assume a linear relationship between the predictors and the outcome, regression trees produce piecewise-constant (step-function) predictions, which can fit certain kinds of relationships more accurately, as the sketch below illustrates.
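Here is a sketch of that contrast (again assuming scikit-learn; the data with an abrupt jump is made up for illustration). A linear model smears the jump across the whole range, while the tree's predictions stay flat on either side of the split:

```python
# Sketch: linear fit vs. step-function fit on data with a sharp jump.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 1))
# Outcome jumps from about 1 to about 4 at x = 5 (illustrative data).
y = np.where(X[:, 0] < 5, 1.0, 4.0) + rng.normal(scale=0.2, size=200)

linear = LinearRegression().fit(X, y)
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)

grid = np.linspace(0, 10, 11).reshape(-1, 1)
# The linear model's predictions rise steadily; the tree's predictions
# stay roughly constant on each side and step near x = 5.
print("linear:", np.round(linear.predict(grid), 2))
print("tree:  ", np.round(tree.predict(grid), 2))
```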
One important hyperparameter for regression trees is...