During model training, two problems come up quite often: underfitting and overfitting. Let's look at each of them next:
- Underfitting: When our model performs poorly on both the training and test data, it is said to be underfitting. This means the model was not able to capture the patterns or underlying trends in our data, so it cannot generalize to unseen data either. For such models, we can tune various hyperparameters so that the model fits the data better. In the case of neural networks, we can add more layers and create a bigger network so that the model can capture more complex patterns in the data.
- Overfitting: Overfitting is the opposite problem that can happen during model training. When the model performs very well on training data but generalizes poorly and performs badly on test data, it is said to be overfitting. Here, the model is essentially memorizing the training data rather than learning its patterns; it can, at times, model the noise in the data instead of the true underlying trend. The sketch after this list shows both failure modes side by side.
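
A minimal sketch of both problems, assuming scikit-learn and NumPy are available: we fit polynomial regression models of different degrees to noisy data. A degree-1 model underfits (high error on both sets), a degree-15 model overfits (low training error, noticeably higher test error), and a moderate degree generalizes best. The dataset and the specific degrees are illustrative choices, not prescriptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Noisy samples of a sine curve: the "underlying trend" plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    # degree=1: underfits -- high error on both training and test sets.
    # degree=15: overfits -- low training error, much higher test error.
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```

Comparing the training and test errors side by side, as the loop above does, is the standard way to diagnose which of the two problems a model suffers from.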