We have just completed an important part of our learning journey through DL architectures: RNNs! In this chapter, we became more familiar with RNNs and their variants. We started with what RNNs are, their evolution path, and how they became the state-of-the-art solution for sequential modeling. We also explored four RNN architectures, categorized by the form of their input and output data, along with industrial examples.
We then discussed a variety of architectures categorized by their recurrent layer, including vanilla RNNs, LSTM, GRU, and bidirectional RNNs. First, we applied the vanilla architecture to write our own War and Peace, albeit a slightly nonsensical one. We produced a better version using LSTM RNNs. GRU, another memory-boosted architecture, was employed for stock price prediction.
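The key difference between these recurrent layers is how the hidden state is updated at each timestep. As a rough NumPy sketch (the weights, dimensions, and function names here are illustrative, not the chapter's code), a vanilla RNN overwrites its entire state, while a GRU uses gates to interpolate between the old state and a candidate state, which is what "boosts" its memory:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def vanilla_rnn_step(x, h, Wx, Wh, b):
    """Vanilla RNN: h_t = tanh(x_t Wx + h_{t-1} Wh + b).
    The whole previous state is squashed and overwritten each step."""
    return np.tanh(x @ Wx + h @ Wh + b)

def gru_step(x, h, p):
    """GRU step (one common gate convention); biases omitted for brevity."""
    z = sigmoid(x @ p["Wz"] + h @ p["Uz"])             # update gate: keep vs replace
    r = sigmoid(x @ p["Wr"] + h @ p["Ur"])             # reset gate: forget old state
    h_cand = np.tanh(x @ p["Wh"] + (r * h) @ p["Uh"])  # candidate new state
    return z * h + (1.0 - z) * h_cand                  # interpolate old and new

# Toy dimensions: input size 3, hidden size 4, batch of 1
d_in, d_h = 3, 4
x = rng.standard_normal((1, d_in))
h0 = np.zeros((1, d_h))

Wx, Wh, b = (rng.standard_normal(s) for s in [(d_in, d_h), (d_h, d_h), (d_h,)])
p = {k: rng.standard_normal((d_in, d_h)) for k in ("Wz", "Wr", "Wh")}
p.update({k: rng.standard_normal((d_h, d_h)) for k in ("Uz", "Ur", "Uh")})

h_rnn = vanilla_rnn_step(x, h0, Wx, Wh, b)
h_gru = gru_step(x, h0, p)
print(h_rnn.shape, h_gru.shape)  # (1, 4) (1, 4)
```

An LSTM layer follows the same gating idea with a separate cell state and three gates instead of two, and a bidirectional layer simply runs two such recurrences over the sequence, one forward and one backward.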
Finally, beyond past information, we introduced the bidirectional...