Summary
Transfer learning is one of the most common techniques for cutting compute costs, saving training time, and still getting strong results. In this chapter, we learned how to build models with pre-trained ResNet-50 and BERT architectures using PyTorch Lightning.
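As a quick recap, the sketch below shows the overall pattern we followed for the image classifier: load a pre-trained ResNet-50 backbone, freeze its weights, and attach a new classification head inside a LightningModule. The class and parameter names here (TransferLearningClassifier, num_classes, lr) are illustrative rather than the exact code from the chapter, and the weights argument assumes torchvision 0.13 or later.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from torchvision import models


class TransferLearningClassifier(pl.LightningModule):
    """Minimal sketch: pre-trained ResNet-50 backbone with a new head."""

    def __init__(self, num_classes: int = 2, lr: float = 1e-3):
        super().__init__()
        # Load ImageNet-pre-trained weights and freeze the backbone.
        self.backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        for param in self.backbone.parameters():
            param.requires_grad = False
        # Replace the final fully connected layer with a task-specific head
        # (the new layer is trainable by default).
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)
        self.lr = lr
        self.criterion = nn.CrossEntropyLoss()

    def forward(self, x):
        return self.backbone(x)

    def training_step(self, batch, batch_idx):
        images, labels = batch
        logits = self(images)
        loss = self.criterion(logits, labels)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        # Only the new head has trainable parameters.
        return torch.optim.Adam(self.backbone.fc.parameters(), lr=self.lr)
```

The same pattern applies to the text classifier: a pre-trained BERT encoder takes the place of the ResNet-50 backbone, and a small linear head is trained on top of it.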
We have built an image classifier and a text classifier, and along the way, we have covered some useful PyTorch Lightning life cycle methods. We have learned how to make use of pre-trained models on our own custom datasets with less effort and fewer training epochs. Even with very little model tuning, we were able to achieve decent accuracy.
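Fine-tuning such a model for only a few epochs follows the standard Lightning Trainer workflow. The short sketch below illustrates this, assuming train_loader is an ordinary PyTorch DataLoader for the custom dataset.

```python
from pytorch_lightning import Trainer

model = TransferLearningClassifier(num_classes=10)

# A handful of epochs is often enough when only the new head is trained.
trainer = Trainer(max_epochs=5, accelerator="auto", devices="auto")
trainer.fit(model, train_dataloaders=train_loader)
```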
While transfer learning methods work well, their limitations should also be borne in mind. They work incredibly well for language models because your dataset's text is usually made up of the same English words that appear in the core training set the model was originally trained on. When that core training set is very different from your dataset, however, performance suffers. For example, if you want to build an image...