Decision Trees
Like logistic regression, the decision tree is a popular classification technique, valued for its simplicity and white-box nature. A decision tree is a simple flowchart represented in the form of a tree (an inverted one): it starts with a root node, branches into several internal nodes that are traversed based on decisions, and ends at a leaf node where the final outcome is determined. Decision trees can be used for regression as well as classification use cases. Several variations of decision trees have been implemented in machine learning. A few popular choices are listed here:
Iterative Dichotomiser 3 (ID3)
Successor to ID3 (C4.5)
Classification and Regression Tree (CART)
CHi-squared Automatic Interaction Detector (CHAID)
Conditional Inference Trees (C Trees)
The preceding list is not exhaustive. There are other alternatives, and each variant differs slightly in how it approaches the tree-building process. In this chapter, we will limit our exploration...
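To make the flowchart idea concrete, the following is a minimal sketch of training and inspecting a decision tree classifier. It assumes scikit-learn and its bundled Iris dataset, which are not part of the text above; the hyperparameters are illustrative rather than recommended settings.

```python
# A minimal sketch (illustrative only): fit a CART-style decision tree
# with scikit-learn and print the learned flowchart as if/else rules.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Small, well-known dataset used here purely as an example.
iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=42
)

# CART-style tree: splits are chosen greedily, here using Gini impurity,
# and max_depth limits how far the tree branches before reaching leaves.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=42)
tree.fit(X_train, y_train)

print("Test accuracy:", tree.score(X_test, y_test))

# The white-box nature: the fitted tree can be printed as readable rules,
# starting at the root node and ending at the leaf nodes.
print(export_text(tree, feature_names=iris.feature_names))
```

Reading the printed rules from the root downward mirrors the traversal described above: each internal node tests one feature, and each leaf holds the predicted class.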