Convolutional neural networks (CNNs) are effective at capturing high-level local features from data. On the other hand, recurrent neural networks (RNNs), such as long short-term memory (LSTM) networks, are effective at capturing long-term dependencies in sequential data such as text. Combining CNN and RNN layers in the same model architecture gives rise to what are called convolutional recurrent neural networks (CRNNs).
This chapter illustrates how to apply convolutional recurrent neural networks to text classification problems by combining the advantages of RNNs and CNNs. The steps involved in this process are text data preparation, defining a convolutional recurrent network model, training the model, and assessing the model.
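To make the CRNN idea concrete before we work through the full pipeline, the following is a minimal, framework-free NumPy sketch of a CRNN forward pass for text classification. All hyperparameters and weight initializations here are illustrative assumptions, not values from this chapter: an embedding layer turns integer-encoded tokens into dense vectors, a 1D convolution extracts local n-gram features, a simple recurrent layer reads those features in order, and a sigmoid head produces a class probability.

```python
import numpy as np

# Illustrative hyperparameters (assumptions for this sketch only).
rng = np.random.default_rng(0)
batch, vocab_size, embed_dim, seq_len = 4, 500, 16, 20
kernel_size, n_filters, hidden = 3, 32, 24

# 1. Text data preparation: a batch of integer-encoded token sequences.
tokens = rng.integers(0, vocab_size, size=(batch, seq_len))

# 2. Embedding lookup: map token ids to dense vectors.
embedding = rng.normal(size=(vocab_size, embed_dim))
x = embedding[tokens]                          # (batch, seq_len, embed_dim)

# 3. Convolutional part: a 1D convolution extracts local n-gram features.
W_conv = rng.normal(size=(kernel_size, embed_dim, n_filters)) * 0.1
conv_steps = seq_len - kernel_size + 1
feat = np.zeros((batch, conv_steps, n_filters))
for t in range(conv_steps):
    window = x[:, t:t + kernel_size, :]        # (batch, kernel, embed)
    # Sum over kernel position and embedding dim, then apply ReLU.
    feat[:, t, :] = np.maximum(0, np.einsum('bke,kef->bf', window, W_conv))

# 4. Recurrent part: a simple RNN reads the conv features left to right,
#    so the hidden state can capture long-range dependencies between them.
W_in = rng.normal(size=(n_filters, hidden)) * 0.1
W_h = rng.normal(size=(hidden, hidden)) * 0.1
h = np.zeros((batch, hidden))
for t in range(conv_steps):
    h = np.tanh(feat[:, t, :] @ W_in + h @ W_h)

# 5. Classification head: sigmoid over the final hidden state gives a
#    per-sequence probability for binary text classification.
w_out = rng.normal(size=(hidden,)) * 0.1
probs = 1 / (1 + np.exp(-(h @ w_out)))         # shape: (batch,)
print(probs.shape)
```

In a real model the weights would of course be learned by backpropagation (e.g. with a deep learning framework), and an LSTM would typically replace the simple RNN cell; the sketch only shows how the convolutional and recurrent stages fit together.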
More specifically, in this chapter...