Now that we have explored a wide variety of machine learning algorithms, I am sure you have realized that most of them come with a great number of settings to choose from. These settings, or tuning knobs, are the so-called hyperparameters, and they let us control the behavior of an algorithm as we try to maximize its performance.
For example, we might want to choose the depth or split criterion of a decision tree, or tune the number of neurons in a neural network. Finding good values for these hyperparameters is a tricky task, but it is necessary for almost all models and datasets.
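To make this concrete, here is a minimal sketch, assuming scikit-learn and a small synthetic dataset, that shows how hyperparameters such as max_depth and criterion are chosen by us before training rather than learned from the data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data purely for illustration
X, y = make_classification(n_samples=500, n_features=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)

# Hyperparameters are fixed when the estimator is instantiated;
# only the tree's internal parameters (the splits) are fitted to the data
tree = DecisionTreeClassifier(max_depth=4, criterion='entropy', random_state=1)
tree.fit(X_train, y_train)
print('Test accuracy: %.3f' % tree.score(X_test, y_test))
```

Trying a different max_depth or criterion and re-running this snippet will generally change the test accuracy, which is exactly why we need principled procedures for evaluating models and tuning these settings.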
In this chapter, we will dive deeper into model evaluation and hyperparameter tuning. Suppose we have two different models that might apply to our task: how can we know which one is better? Answering this question often involves repeatedly fitting different...