Understanding the impact of varying the loss optimizer
So far, we have minimized loss using the Adam optimizer. A loss optimizer determines how weight values are updated in order to minimize the overall loss. There are a variety of loss optimizers, each updating weights in a different way, and the choice of optimizer impacts the overall loss and accuracy of a model. In this section, we will do the following:
- Modify the optimizer so that it becomes a Stochastic Gradient Descent (SGD) optimizer
- Revert to a batch size of 32 while fetching data in the DataLoader
- Increase the number of epochs to 10 (so that we can compare the performance of SGD and Adam over a larger number of epochs)
With these changes, only one step from the Batch size of 32 section will change (the batch size is already 32 in that section): we will modify the optimizer so that it's the SGD optimizer.
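The batch-size and epoch settings above can be sketched as follows. The dataset here is a hypothetical stand-in, since the book's actual data pipeline is defined in earlier steps:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in data: 320 flattened 28x28 images with 10 classes
X = torch.randn(320, 28 * 28)
y = torch.randint(0, 10, (320,))

# Revert to a batch size of 32 while fetching data in the DataLoader
trn_dl = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

# Train for 10 epochs so SGD and Adam can be compared over more epochs
n_epochs = 10
```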
Let's modify the get_model function in step 4 of the...
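A minimal sketch of the modified get_model function follows. The architecture and learning rate are assumptions (the book's actual network is defined in earlier steps); the point is that the only change from the Adam version is the optimizer line:

```python
import torch
from torch import nn

def get_model():
    # Hypothetical architecture standing in for the book's earlier model:
    # a simple fully connected network for flattened 28x28 inputs
    model = nn.Sequential(
        nn.Linear(28 * 28, 1000),
        nn.ReLU(),
        nn.Linear(1000, 10),
    )
    loss_fn = nn.CrossEntropyLoss()
    # The one change: SGD in place of Adam (learning rate is an assumption)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    return model, loss_fn, optimizer
```

Everything else in the training loop stays the same; swapping `torch.optim.Adam` for `torch.optim.SGD` is the only modification needed to run the comparison.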