In a bidirectional RNN, we have two separate layers of hidden units, and both layers connect the input layer to the output layer. In one layer, the hidden states are shared from left to right, and in the other layer, they are shared from right to left.
But what does this mean? To put it simply, one hidden layer moves forward through time from the start of the sequence, while the other hidden layer moves backward through time from the end of the sequence.
As shown in the following diagram, we have two hidden layers: a forward hidden layer and a backward hidden layer, which are described as follows:
- In the forward hidden layer, hidden state values are shared from past time steps, that is, $\overrightarrow{h}_1$ is shared to $\overrightarrow{h}_2$, $\overrightarrow{h}_2$ is shared to $\overrightarrow{h}_3$, and so on
- In the backward hidden layer, hidden state values are shared from future time steps, that is, $\overleftarrow{h}_3$ to $\overleftarrow{h}_2$, $\overleftarrow{h}_2$ to $\overleftarrow{h}_1$, and so on
The forward hidden state and the backward hidden state are computed as:

$$\overrightarrow{h}_t = \sigma\left(W_{x\overrightarrow{h}}\, x_t + W_{\overrightarrow{h}\overrightarrow{h}}\, \overrightarrow{h}_{t-1} + b_{\overrightarrow{h}}\right)$$

$$\overleftarrow{h}_t = \sigma\left(W_{x\overleftarrow{h}}\, x_t + W_{\overleftarrow{h}\overleftarrow{h}}\, \overleftarrow{h}_{t+1} + b_{\overleftarrow{h}}\right)$$

The output at each time step then combines both hidden states, so every prediction can draw on both past and future context:

$$y_t = W_{\overrightarrow{h}y}\, \overrightarrow{h}_t + W_{\overleftarrow{h}y}\, \overleftarrow{h}_t + b_y$$
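To make the two passes concrete, here is a minimal NumPy sketch of a bidirectional RNN forward pass. It assumes tanh activations and small randomly initialized weights, and simply concatenates the two hidden states at each step instead of applying the output weights; the names (`W_xh_f`, `W_hh_b`, and so on) are illustrative, not from any particular library:

```python
import numpy as np

def bidirectional_rnn(X, hidden_size, seed=0):
    """X: sequence of input vectors with shape (T, input_size)."""
    rng = np.random.default_rng(seed)
    T, input_size = X.shape

    # Separate parameters for the forward and the backward hidden layers.
    W_xh_f = rng.normal(0, 0.1, (hidden_size, input_size))
    W_hh_f = rng.normal(0, 0.1, (hidden_size, hidden_size))
    W_xh_b = rng.normal(0, 0.1, (hidden_size, input_size))
    W_hh_b = rng.normal(0, 0.1, (hidden_size, hidden_size))

    # Forward layer: moves from the start of the sequence, so each
    # state depends on the previous (past) hidden state.
    h_f = np.zeros(hidden_size)
    forward_states = []
    for t in range(T):
        h_f = np.tanh(W_xh_f @ X[t] + W_hh_f @ h_f)
        forward_states.append(h_f)

    # Backward layer: moves from the end of the sequence, so each
    # state depends on the next (future) hidden state.
    h_b = np.zeros(hidden_size)
    backward_states = [None] * T
    for t in reversed(range(T)):
        h_b = np.tanh(W_xh_b @ X[t] + W_hh_b @ h_b)
        backward_states[t] = h_b

    # Combine both directions at every time step (concatenation here).
    return np.stack([np.concatenate([f, b])
                     for f, b in zip(forward_states, backward_states)])

outputs = bidirectional_rnn(np.random.randn(5, 4), hidden_size=3)
print(outputs.shape)  # (5, 6): forward + backward states per step
```

Note that the backward loop runs over the same inputs in reverse order, which is why the combined state at time step $t$ carries information from both directions of the sequence.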