Another type of unit often used in RNNs is the gated recurrent unit (GRU). These units are simpler than LSTM units because they have only two gates: update and reset. The update gate controls how much of the previous memory is carried over to the new state, and the reset gate controls how much of the previous memory is combined with the current input when computing the candidate state. The flow of data is visualized in the following figure:
Figure 4.3: Example flow in a GRU unit
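For reference, one common formulation of the GRU (following Cho et al., 2014; note that some implementations, including Keras, swap the roles of $z_t$ and $1 - z_t$) is:

$$
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)} \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)} \\
\tilde{h}_t &= \tanh\bigl(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\bigr) && \text{(candidate state)} \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t && \text{(new state)}
\end{aligned}
$$

Here $\odot$ denotes element-wise multiplication. Because the GRU exposes its full state $h_t$ directly (there is no separate output gate or cell state as in an LSTM), it has fewer parameters for the same hidden size.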
In this recipe, we will show how to incorporate a GRU into an RNN architecture to classify text with Keras.
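As a preview of the approach, the following is a minimal sketch of such an architecture for binary sentiment classification. The dataset (the IMDB reviews bundled with Keras), vocabulary size, and hyperparameters here are illustrative assumptions, not necessarily the exact settings used in the recipe steps:

```python
# A minimal sketch of a GRU-based text classifier in Keras.
# Vocabulary size, sequence length, and layer sizes are illustrative choices.
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GRU, Dense

max_features = 10000   # vocabulary size (assumption)
maxlen = 200           # pad/truncate each review to 200 tokens (assumption)

(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
x_train = pad_sequences(x_train, maxlen=maxlen)
x_test = pad_sequences(x_test, maxlen=maxlen)

model = Sequential([
    Embedding(max_features, 64),   # learn 64-dimensional word embeddings
    GRU(64),                       # a single GRU layer in place of an LSTM
    Dense(1, activation='sigmoid') # binary sentiment output
])
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=128, epochs=3,
          validation_data=(x_test, y_test))
```

Swapping `GRU` for `LSTM` in a model like this is a one-line change, which makes it easy to compare the two unit types on the same task.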