In the previous chapter, we learned about leveraging novel architectures when only a small amount of training data is available. In this chapter, we will switch gears and learn how a Convolutional Neural Network (CNN) can be used in conjunction with algorithms from the broad family of Recurrent Neural Networks (RNNs), which (as of the time of writing this book) are heavily used in Natural Language Processing (NLP), to develop solutions that leverage both computer vision and NLP.
To understand how CNNs and RNNs can be combined, we will first learn how RNNs and their variants, primarily Long Short-Term Memory (LSTM) networks, work, and how they can be applied to predict annotations given an image as input. After that, we will learn about another important loss function, the Connectionist Temporal Classification (CTC) loss function...
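Before diving into the details, it may help to see the LSTM recurrence that the chapter builds on. The following is a minimal sketch of a single LSTM time step in plain NumPy; the weight names `W`, `U`, and `b`, and the gate ordering, are illustrative assumptions, not the book's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x:      input vector at this time step, shape (D,)
    h_prev: previous hidden state, shape (H,)
    c_prev: previous cell state, shape (H,)
    W, U, b: stacked parameters for the four gates,
             shapes (4H, D), (4H, H), (4H,)  (illustrative layout)
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # pre-activations for all four gates
    i = sigmoid(z[0:H])               # input gate: how much new info to write
    f = sigmoid(z[H:2*H])             # forget gate: how much old memory to keep
    o = sigmoid(z[2*H:3*H])           # output gate: how much memory to expose
    g = np.tanh(z[3*H:4*H])           # candidate cell update
    c = f * c_prev + i * g            # new cell state ("long-term" memory)
    h = o * np.tanh(c)                # new hidden state ("short-term" output)
    return h, c
```

In the image-annotation setting previewed above, the CNN's feature vector typically seeds or conditions this recurrence, and the hidden state `h` at each step is decoded into the next output token.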