In this chapter, we presented the RNN architecture, which is specialized for sequential data. We covered how RNNs work, analyzed their computational graph, and saw how sharing parameters over numerous time steps lets RNNs capture long-range dependencies that FFNNs and CNNs are not well suited for.
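The parameter-sharing idea can be made concrete with a minimal sketch of a vanilla RNN forward pass in NumPy (shapes and random weights here are purely illustrative, not from the chapter's models): the same matrices `W_x` and `W_h` are applied at every time step, so one fixed set of weights can process a sequence of any length.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features, n_hidden, n_steps = 3, 5, 7
W_x = rng.normal(scale=0.1, size=(n_hidden, n_features))  # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))    # hidden-to-hidden weights
b = np.zeros(n_hidden)

def rnn_forward(x_seq):
    """Apply the SAME shared weights at each step; return all hidden states."""
    h = np.zeros(n_hidden)
    states = []
    for x_t in x_seq:  # one update per time step
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        states.append(h)
    return np.stack(states)

x_seq = rng.normal(size=(n_steps, n_features))
states = rnn_forward(x_seq)
print(states.shape)  # one hidden state per time step: (7, 5)
```

Because the recurrence reuses `W_h` at every step, gradients are products of many Jacobians of this shared matrix, which is exactly what gives rise to the vanishing and exploding gradient issues discussed next.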
We also reviewed the challenges of vanishing and exploding gradients and saw how gated units such as long short-term memory (LSTM) cells enable RNNs to learn dependencies over hundreds of time steps. Finally, we applied RNN models to tasks common in algorithmic trading, such as univariate and multivariate time series prediction and sentiment analysis.
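To recap how gating works, here is an illustrative single LSTM cell step in NumPy (weight values and dimensions are assumptions for the sketch, not the chapter's trained models). The input, forget, and output gates are sigmoids that control what enters, persists in, and leaves the cell state `c`; the additive cell-state update is what lets gradients survive across many time steps.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 4, 6
# One weight matrix and bias per gate (i, f, o) plus the candidate update (g).
W = {g: rng.normal(scale=0.1, size=(n_hid, n_in + n_hid)) for g in "ifog"}
b = {g: np.zeros(n_hid) for g in "ifog"}

def lstm_step(x, h, c):
    z = np.concatenate([x, h])        # current input joined with previous state
    i = sigmoid(W["i"] @ z + b["i"])  # input gate: what to write
    f = sigmoid(W["f"] @ z + b["f"])  # forget gate: what to keep
    o = sigmoid(W["o"] @ z + b["o"])  # output gate: what to expose
    g = np.tanh(W["g"] @ z + b["g"])  # candidate cell update
    c = f * c + i * g                 # additive memory path through time
    h = o * np.tanh(c)                # new hidden state
    return h, c

h = c = np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):  # run a short sequence through the cell
    h, c = lstm_step(x, h, c)
print(h.shape, c.shape)
```

Note that the cell state is updated by elementwise addition rather than repeated matrix multiplication, which is the key design choice that mitigates vanishing gradients.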
In the next chapter, we will introduce unsupervised deep learning techniques, such as autoencoders and Generative Adversarial Networks (GANs), and their applications in investment and trading strategies.