What are hyperparameters?
As we discussed in Chapter 2, hyperparameters are parameters that define how our model training jobs run. They are not the parameters that our models learn from the dataset; rather, they are external configuration options that control how the training process is executed. They influence how the resulting models perform, and they represent higher-level properties of the model, such as its complexity or how quickly it should learn.
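To make the distinction concrete, here is a minimal sketch, assuming scikit-learn and a synthetic dataset purely for illustration (neither is prescribed by this section): the hyperparameter is chosen before training, while the model's parameters are learned from the data during fitting.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic dataset used only to make the example runnable.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# C (inverse regularization strength) is a hyperparameter:
# we set it up front as external configuration of the training job.
model = LogisticRegression(C=0.5, max_iter=1000)
model.fit(X, y)

# coef_ and intercept_ are model parameters: they are learned from the dataset.
print(model.coef_, model.intercept_)
```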
The following are examples of hyperparameters that we’ve already discussed in this book:
- In our Chapter 2 discussion of hyperparameters, we covered examples such as the learning rate and the number of epochs
- In Chapter 5, we configured the number of clusters as a hyperparameter for our K-means algorithm, and we configured hyperparameters for our tree-based models, such as the maximum depth of our trees
- We talked about regularization in Chapter 7, and regularization parameters are another example of hyperparameters, as shown in the sketch after this list
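The following sketch, again assuming scikit-learn purely for illustration (the chapters above may use other libraries), shows how the hyperparameters listed here are passed in as configuration before training starts.

```python
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import SGDClassifier

# Number of clusters for K-means (Chapter 5).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)

# Maximum depth for a tree-based model (Chapter 5).
tree = DecisionTreeClassifier(max_depth=4, random_state=42)

# Learning rate (eta0), number of epochs (max_iter), and regularization
# strength (alpha) for a model trained with stochastic gradient descent
# (Chapters 2 and 7).
sgd = SGDClassifier(learning_rate="constant", eta0=0.01, max_iter=20, alpha=0.0001)
```

In each case, the values are fixed before we call `fit()`; changing them changes how training behaves, not what the model has already learned.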