Fine-Tuning
In the previous section, we learned how to apply transfer learning, using pretrained models to make predictions on our own dataset. With this approach, we froze the entire network and trained only the last few layers, which were responsible for making the predictions. The convolutional layers stayed the same: all their filters were set in advance, and we simply reused them.
But if the dataset you are using is very different from ImageNet, these pretrained filters may not be relevant, and even transfer learning will not help your model predict accurate outcomes. The solution is fine-tuning: freeze only a portion of the network and train the rest of the model, rather than training just the top layers as we did with transfer learning.
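As a minimal sketch of partial freezing, the snippet below builds a small stand-in network (the layer sizes and the choice of PyTorch are assumptions for illustration, not the book's actual model) and marks only the first convolutional block as frozen, leaving the later layers trainable:

```python
import torch.nn as nn

# Hypothetical stand-in for a pretrained backbone; a real workflow
# would load an actual pretrained model instead.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),   # early layer: generic filters
    nn.ReLU(),
    nn.Conv2d(8, 16, 3, padding=1),  # later layer: more task-specific
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),     # new classification head
)

# Freeze only the first convolutional layer; everything after it
# stays trainable and will be fine-tuned on the new dataset.
for param in model[0].parameters():
    param.requires_grad = False

frozen = [n for n, p in model.named_parameters() if not p.requires_grad]
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print("frozen:", frozen)
print("trainable:", trainable)
```

When training, only the unfrozen parameters should be passed to the optimizer, for example with `filter(lambda p: p.requires_grad, model.parameters())`.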
In the early layers of a network, the filters tend to be quite generic. For instance, at that stage you may find filters that detect horizontal or vertical lines. The filters closer to the end of...