Decision trees can be used for both classification and regression. A decision tree answers a sequence of yes/no (true/false) questions; each answer determines which branch an example follows next, so the tree routes it along a predetermined path until a prediction is reached at a leaf. More formally, a decision tree is a special case of a directed acyclic graph. Finally, a single decision tree is built using the entire dataset and all of the features.
Here is an example of a decision tree. You may not recognize it as a decision tree, but you certainly know the process. Anyone for a doughnut?
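The doughnut flow above can be sketched as ordinary code: a fixed sequence of yes/no questions, where each answer selects the next branch until a leaf decision is reached. The specific questions below are invented for illustration, not taken from the figure.

```python
def doughnut_decision(is_hungry: bool, on_a_diet: bool, doughnuts_left: int) -> str:
    """Follow a fixed sequence of yes/no questions to a leaf decision,
    mirroring how a decision tree routes an example from root to leaf.
    (The questions themselves are hypothetical.)"""
    if not is_hungry:           # root node: first yes/no split
        return "no doughnut"
    if on_a_diet:               # internal node: second split
        return "no doughnut"
    if doughnuts_left > 0:      # internal node: third split
        return "take a doughnut"
    return "no doughnut"        # leaf reached when no doughnuts remain

print(doughnut_decision(True, False, 3))   # take a doughnut
print(doughnut_decision(False, False, 3))  # no doughnut
```

Each `if` plays the role of an internal node, and each `return` is a leaf: once a question is answered, the remaining questions on other branches are never asked.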
As you can see, the flow of a decision tree starts at the top and works its way downward until a specific result is reached. The root of the tree holds the first decision, which makes the initial split of the dataset; the tree then recursively splits the data at each internal node according to a splitting metric. Two of the most popular metrics are Gini impurity and entropy (the basis of information gain).
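To make the splitting metric concrete, here is a minimal sketch of Gini impurity and of choosing the best threshold on a single numeric feature by minimizing the weighted impurity of the two child nodes. The function names and the toy data are my own for illustration.

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity of a node: 1 - sum(p_k^2) over class proportions p_k.
    0.0 means the node is pure (one class); 0.5 is the worst case for two classes."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def best_split(values, labels):
    """Scan candidate thresholds on one numeric feature and return the
    (threshold, weighted_gini) pair that minimizes the weighted impurity
    of the resulting left (<= t) and right (> t) children."""
    n = len(labels)
    best = (None, gini_impurity(labels))  # start from the unsplit node
    for t in sorted(set(values)):
        left = [lab for v, lab in zip(values, labels) if v <= t]
        right = [lab for v, lab in zip(values, labels) if v > t]
        weighted = (len(left) * gini_impurity(left)
                    + len(right) * gini_impurity(right)) / n
        if weighted < best[1]:
            best = (t, weighted)
    return best

print(gini_impurity([0, 0, 1, 1]))                            # 0.5 (50/50 node)
print(best_split([1, 2, 3, 10, 11, 12], [0, 0, 0, 1, 1, 1]))  # (3, 0.0): pure children
```

A full tree builder simply applies `best_split` across every feature at every node and recurses on the two children until the nodes are pure (or another stopping rule fires), which is the recursive splitting described above.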