Using gated recurrent units (GRUs)
Another type of unit used in RNNs is the gated recurrent unit (GRU). These units are actually simpler than LSTM units, because they only have two gates: update and reset. The update gate determines how much of the previous memory to keep, and the reset gate determines how to combine the new input with the previous memory. The flow of data is visualized in the following figure:
Figure 4.3: Example flow in a GRU unit
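To make the gating concrete, one common formulation of the GRU update is the following (bias terms omitted; note that some papers and implementations swap the roles of $z_t$ and $1 - z_t$ in the last line):

$z_t = \sigma(W_z x_t + U_z h_{t-1})$
$r_t = \sigma(W_r x_t + U_r h_{t-1})$
$\tilde{h}_t = \tanh(W x_t + U(r_t \odot h_{t-1}))$
$h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t$

Here $\sigma$ is the sigmoid function and $\odot$ is element-wise multiplication. Because the new state $h_t$ is a direct interpolation between the previous state and the candidate $\tilde{h}_t$, the unit can carry information across many time steps without the separate cell state and output gate that an LSTM needs.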
In this recipe, we will show how to incorporate a GRU into an RNN architecture to classify text with Keras.
How to do it...
- Let's start by importing the libraries as follows:
import numpy as np
import pandas as pd
from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers import Embedding
from keras.layers import GRU
from keras.callbacks import EarlyStopping
from keras.datasets import imdb
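- The recipe continues from these imports. As a minimal sketch of where it is headed, the steps below load the IMDB dataset, pad the reviews to a fixed length, and stack an Embedding layer, a GRU layer, and a Dense output layer; the vocabulary size, sequence length, and layer sizes here are illustrative choices, not necessarily the values used later in this recipe:

# Illustrative hyperparameters -- placeholder values, not the recipe's exact settings
n_words = 1000   # vocabulary size kept from the IMDB dataset
max_len = 200    # pad/truncate every review to this many tokens

# Load the IMDB sentiment dataset bundled with Keras
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=n_words)

# Pad the variable-length reviews so they can be batched
X_train = sequence.pad_sequences(X_train, maxlen=max_len)
X_test = sequence.pad_sequences(X_test, maxlen=max_len)

# Embedding -> GRU -> Dense: a minimal GRU text classifier
model = Sequential()
model.add(Embedding(n_words, 50, input_length=max_len))
model.add(Dropout(0.2))
model.add(GRU(100))
model.add(Dropout(0.2))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Stop training when the validation loss no longer improves
callbacks = [EarlyStopping(monitor='val_loss', patience=2)]
model.fit(X_train, y_train, batch_size=128, epochs=10,
          validation_split=0.2, callbacks=callbacks)

model.evaluate(X_test, y_test)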