Understanding advancements over the standard GRU and LSTM layers
GRU and LSTM are the most widely used RNN layers today, but one might wonder how to push beyond what a standard GRU or a standard LSTM can achieve. A good starting point for building this intuition is to note that both layer types accept sequential data, and that building a network typically involves stacking multiple RNN layers. This means it is entirely possible to combine GRU and LSTM layers in the same network, as the sketch below shows. On its own, however, this hardly qualifies as an advancement, since a purely LSTM or purely GRU network can often match or exceed the performance of a mixed LSTM-and-GRU network. Let’s dive instead into another simple improvement you can make on top of these standard RNN layers, called the bidirectional RNN.
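To make the idea of mixing layer types concrete, here is a minimal sketch, assuming PyTorch as the framework; the MixedRNN class name and all layer sizes are illustrative choices, not taken from the text:

```python
import torch
import torch.nn as nn

class MixedRNN(nn.Module):
    """Illustrative network stacking a GRU layer on top of an LSTM layer."""
    def __init__(self, input_size=16, hidden_size=32, num_classes=4):
        super().__init__()
        # First recurrent layer: LSTM, returning the full output sequence.
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        # Second recurrent layer: GRU, consuming the LSTM's hidden sequence.
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):           # x: (batch, seq_len, input_size)
        out, _ = self.lstm(x)       # (batch, seq_len, hidden_size)
        out, _ = self.gru(out)      # (batch, seq_len, hidden_size)
        return self.fc(out[:, -1])  # classify from the last time step

model = MixedRNN()
dummy = torch.randn(8, 20, 16)      # batch of 8 sequences, 20 steps, 16 features
print(model(dummy).shape)           # torch.Size([8, 4])
```

Because both layers return a sequence of hidden states, they can be chained freely; the only requirement is that each layer’s input size matches the previous layer’s hidden size.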
Decoding bidirectional RNN
Both GRU and LSTM rely on the sequential nature of the data. This ordering of the sequence can be forward, in increasing time steps, or backward, in decreasing time steps.
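A bidirectional RNN exploits this by running two recurrent passes over the same sequence, one forward and one backward, and concatenating their hidden states at each time step. As a minimal sketch, assuming PyTorch (where standard RNN layers expose a bidirectional flag; the tensor sizes here are illustrative):

```python
import torch
import torch.nn as nn

# Bidirectional LSTM: one pass reads the sequence forward, the other backward,
# and the two hidden states are concatenated at every time step.
bilstm = nn.LSTM(input_size=16, hidden_size=32,
                 batch_first=True, bidirectional=True)

x = torch.randn(8, 20, 16)          # (batch, seq_len, features)
out, _ = bilstm(x)
# The output feature dimension doubles: 32 forward + 32 backward = 64.
print(out.shape)                    # torch.Size([8, 20, 64])
```

Note the practical consequence: the output feature dimension is twice the hidden size, so any layer stacked on top must expect the doubled width.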