Trees, forests, and more trees
Probably the most popular classification and prediction algorithm is the decision tree algorithm. It gives quite good results and is easy to understand. The algorithm is also called recursive partitioning. You start with all the data in one group. Then you try splitting the data on the values of each input variable, one at a time. After each candidate split, you check the distribution of the target variable in the new subgroups. You keep the split that gives you the purest subgroups in terms of the target variable and discard all the other splits. Then you split the subgroups again and again, as long as the purity of the target variable keeps improving, or until some other stopping condition is met.
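The following is a minimal sketch of recursive partitioning on discrete input variables, not a production implementation. It assumes a small in-memory dataset of (feature dictionary, label) pairs, uses Gini impurity as the purity measure, considers only binary equals/not-equals splits, and does no pruning; the toy customer data and the helper names are hypothetical, chosen only to illustrate the idea.

```python
# A minimal sketch of recursive partitioning on discrete inputs.
# Assumptions: Gini impurity as the purity measure, binary splits, no pruning.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of target labels: lower means purer."""
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(rows, labels, features):
    """Try every value of every input variable; return the (feature, value)
    split with the lowest weighted impurity, or None if nothing improves."""
    best = None
    best_impurity = gini(labels)              # impurity before splitting
    for feature in features:
        for value in set(row[feature] for row in rows):
            left = [lab for row, lab in zip(rows, labels) if row[feature] == value]
            right = [lab for row, lab in zip(rows, labels) if row[feature] != value]
            if not left or not right:
                continue
            weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if weighted < best_impurity:      # keep only the purest split
                best_impurity = weighted
                best = (feature, value)
    return best

def grow_tree(rows, labels, features, min_size=2):
    """Split the subgroups again and again until purity stops improving
    or the stopping condition (min_size) is reached."""
    if len(set(labels)) == 1 or len(rows) < min_size:
        return Counter(labels).most_common(1)[0][0]    # leaf: majority class
    split = best_split(rows, labels, features)
    if split is None:                                   # no split improves purity
        return Counter(labels).most_common(1)[0][0]
    feature, value = split
    left = [i for i, row in enumerate(rows) if row[feature] == value]
    right = [i for i, row in enumerate(rows) if row[feature] != value]
    return {
        "split": (feature, value),
        "left": grow_tree([rows[i] for i in left], [labels[i] for i in left], features, min_size),
        "right": grow_tree([rows[i] for i in right], [labels[i] for i in right], features, min_size),
    }

# Hypothetical toy data: predict whether a customer buys a bike.
rows = [
    {"age": "young", "income": "low"},
    {"age": "young", "income": "high"},
    {"age": "old", "income": "low"},
    {"age": "old", "income": "high"},
]
labels = ["no", "yes", "no", "yes"]
print(grow_tree(rows, labels, features=["age", "income"]))
```

On this toy data the sketch splits once on income, because that single split already makes both subgroups completely pure, and then stops, exactly the behavior described above.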
Decision trees use discrete variables for splitting. If some of the input variables are continuous and the target variable is continuous as well, you get regression trees. The discrete variables are used for the splits, and the continuous variables for the regression formula in each branch of the tree. You get a...