There are probably many other features we could add, but let's now shift our attention to the model itself. So far, we have assumed the model's default parameters, only restricting its max_depth to an arbitrary value. Now, let's try to fine-tune those parameters. Done properly, this process can add a few percentage points to the model's accuracy, and sometimes even a small gain in a performance metric can be a game-changer.
To do this, we'll use RandomizedSearchCV, another wrapper around the concept of cross-validation, but this time one that iterates over the model's hyperparameters, trying to find the optimal combination. A simpler approach, GridSearchCV, takes a finite set of candidate values for each parameter, builds every possible combination of them, and evaluates each one in turn: essentially, a brute-force search.
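To make the distinction concrete, here is a minimal sketch of a grid search. The RandomForestClassifier, the particular parameter values, and the synthetic dataset are illustrative stand-ins, not the exact model and features used in this project:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic data standing in for the features engineered earlier
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# A finite grid: every combination is tried, so 3 * 2 * 2 = 12 candidates,
# each evaluated with 5-fold cross-validation (60 model fits in total)
param_grid = {
    "max_depth": [5, 10, 15],
    "n_estimators": [100, 300],
    "min_samples_leaf": [1, 5],
}

grid_search = GridSearchCV(
    estimator=RandomForestClassifier(random_state=42),
    param_grid=param_grid,
    cv=5,
    scoring="accuracy",
    n_jobs=-1,
)
grid_search.fit(X, y)
print(grid_search.best_params_, grid_search.best_score_)
```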
RandomizedSearchCV, by contrast, samples a fixed number of parameter combinations from the ranges or distributions you provide, so the cost of the search stays under your control even when the parameter space is large.
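And here is the randomized counterpart, a sketch under the same assumptions (a RandomForestClassifier and synthetic data). The key differences are that each parameter gets a distribution to sample from, and n_iter caps how many combinations are tried:

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Synthetic data standing in for the real feature matrix and target
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Distributions to sample from; max_depth is the parameter we previously fixed by hand
param_distributions = {
    "max_depth": randint(3, 20),
    "n_estimators": randint(100, 500),
    "min_samples_leaf": randint(1, 10),
}

random_search = RandomizedSearchCV(
    estimator=RandomForestClassifier(random_state=42),
    param_distributions=param_distributions,
    n_iter=25,            # only 25 random combinations, regardless of how large the space is
    cv=5,                 # 5-fold cross-validation for each combination
    scoring="accuracy",
    random_state=42,
    n_jobs=-1,
)
random_search.fit(X, y)
print(random_search.best_params_, random_search.best_score_)
```

After fitting, best_params_ and best_estimator_ hold the winning configuration, which you can then retrain on the full training data or evaluate on a held-out test set.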