Summary
Transfer learning has enabled much of the recent progress in NLP, a domain where raw text is readily available but labeled data is scarce. We first covered the different types of transfer learning. Then, we took pre-trained GloVe embeddings and applied them to the IMDb sentiment analysis problem, achieving comparable accuracy with a much smaller model that takes far less time to train.
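As a refresher, the core of that approach is loading the GloVe vectors into a frozen embedding layer. The sketch below illustrates the idea under stated assumptions: the file name, the placeholder vocabulary, and the dimensions are illustrative, not the chapter's exact code.

```python
import numpy as np
import tensorflow as tf

EMBEDDING_DIM = 100  # matches the glove.6B.100d.txt file assumed below

# Parse the GloVe text file into a word -> vector dictionary
glove = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        values = line.split()
        glove[values[0]] = np.asarray(values[1:], dtype="float32")

# word_index maps each token in the corpus vocabulary to an integer id;
# a placeholder stands in for the real IMDb vocabulary here
word_index = {"the": 1, "movie": 2, "great": 3}

# Build the embedding matrix, leaving rows of zeros for unknown words
embedding_matrix = np.zeros((len(word_index) + 1, EMBEDDING_DIM))
for word, idx in word_index.items():
    vector = glove.get(word)
    if vector is not None:
        embedding_matrix[idx] = vector

# trainable=False keeps the GloVe weights fixed, which is what makes the
# model small and fast to train: only the layers above it are learned
embedding_layer = tf.keras.layers.Embedding(
    input_dim=embedding_matrix.shape[0],
    output_dim=EMBEDDING_DIM,
    weights=[embedding_matrix],
    trainable=False,
)
```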
Next, we traced seminal moments in the evolution of NLP models, starting with encoder-decoder architectures, moving through attention and the Transformer architecture, and arriving at the BERT model. Using the Hugging Face library, we applied a pre-trained BERT model and built a custom model on top of it to classify the sentiment of IMDb reviews.
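A minimal sketch of that pattern follows, assuming the `bert-base-uncased` checkpoint and a simple sigmoid head; it illustrates the shape of the approach rather than reproducing the chapter's exact model.

```python
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

# Pre-trained tokenizer and encoder from the Hugging Face hub
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = TFBertModel.from_pretrained("bert-base-uncased")

# Variable-length token ids and attention mask as model inputs
input_ids = tf.keras.layers.Input(shape=(None,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.layers.Input(shape=(None,), dtype=tf.int32, name="attention_mask")

# Use BERT's pooled [CLS] representation as the summary of the review,
# then add a small custom head for binary sentiment classification
outputs = bert(input_ids, attention_mask=attention_mask)
pooled = tf.keras.layers.Dropout(0.2)(outputs.pooler_output)
sentiment = tf.keras.layers.Dense(1, activation="sigmoid")(pooled)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=sentiment)
model.compile(
    optimizer=tf.keras.optimizers.Adam(2e-5),  # small LR, typical for fine-tuning
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Score a single review
enc = tokenizer("A wonderful, moving film.", return_tensors="tf")
print(model.predict([enc["input_ids"], enc["attention_mask"]]))
```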
BERT uses only the encoder part of the Transformer model. The decoder side of the stack is used in text generation. The next two chapters will focus on completing our understanding of the Transformer architecture. The next chapter will use the decoder to generate text.