In this chapter, we saw that RNNs are Turing complete, which means that, in principle, they can compute any computable function. We then explored how to model time-dependent, or time series, data.
We learned how to implement an LSTM and studied its architecture, including its ability to capture both long- and short-term dependencies. An LSTM has a chain-like structure similar to that of a simple RNN; however, instead of a single layer, each repeating cell contains four neural network layers. These layers form gates that allow the network to add information to, or remove it from, the cell state when certain conditions are met.
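As a refresher, here is a minimal NumPy sketch of a single LSTM step. The layout (the four gate blocks stacked into one weight matrix `W`, with hidden size `n` and input size `m`) is an illustrative choice for compactness, not the exact implementation from the chapter:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step. W maps the concatenated [h_prev, x_t] to the
    four stacked gate pre-activations; b is the matching bias vector."""
    z = W @ np.concatenate([h_prev, x_t]) + b
    n = len(h_prev)
    f = sigmoid(z[0 * n:1 * n])   # forget gate: what to drop from the cell state
    i = sigmoid(z[1 * n:2 * n])   # input gate: how much new information to admit
    g = np.tanh(z[2 * n:3 * n])   # candidate values to write into the cell state
    o = sigmoid(z[3 * n:4 * n])   # output gate: what to expose as the hidden state
    c = f * c_prev + i * g        # long-term memory (cell state) update
    h = o * np.tanh(c)            # short-term memory (hidden state)
    return h, c

# Illustrative sizes and random weights, just to show the shapes involved
n, m = 32, 8
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * n, n + m)) * 0.1
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
h, c = lstm_step(rng.standard_normal(m), h, c, W, b)
```

The forget and input gates together maintain the long-term cell state, while the output gate produces the short-term hidden state, which is what gives the LSTM its ability to track dependencies at both time scales.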
Additionally, we learned how to implement an RNN using Keras. We also introduced PyTorch, another tool that is especially useful for complex tasks such as NLP. PyTorch builds its computation graph dynamically at execution time, which makes it a natural fit for tasks involving variable-length data.
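To recap the two tools, here are two minimal, self-contained sketches; the shapes, layer sizes, and hyperparameters are illustrative placeholders rather than the exact values used in the chapter. First, a Keras recurrent model trained on random data:

```python
import numpy as np
from tensorflow import keras

# Toy data: 100 sequences, each with 20 time steps of 8 features
x = np.random.rand(100, 20, 8).astype("float32")
y = np.random.rand(100, 1).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(20, 8)),  # (time steps, features)
    keras.layers.LSTM(32),              # recurrent layer with 32 units
    keras.layers.Dense(1),              # one scalar prediction per sequence
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=2, batch_size=16, verbose=0)
```

Second, a sketch of why PyTorch's dynamic graph helps with variable-length data: because the graph is rebuilt on every forward pass, each batch can have a different sequence length with no special handling:

```python
import torch
import torch.nn as nn

rnn = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)
head = nn.Linear(32, 1)

for seq_len in (5, 17, 42):
    x = torch.randn(4, seq_len, 8)  # batch of 4, length varies per batch
    out, (h, c) = rnn(x)            # graph is built on the fly for this length
    pred = head(h[-1])              # predict from the final hidden state
    print(seq_len, pred.shape)      # torch.Size([4, 1]) each time
```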
In the next chapter...