Summary
In this chapter, we've introduced deep learning concepts relevant to time-series and discussed a range of architectures and algorithms, such as autoencoders, InceptionTime, DeepAR, N-BEATS, ConvNets, and a few transformer architectures. Deep learning models are now competitive with the state of the art in time-series, and this remains an exciting area of research and application.
In the practice section, we implemented a fully connected feedforward network and then an RNN, before taking a causal ConvNet for a ride; a minimal sketch of the causal convolution idea follows below.
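As a quick illustration of the causal ConvNet idea, here is a minimal sketch, assuming Keras (the specific layer sizes, dilation rates, and the toy sine-wave data are illustrative choices, not the chapter's exact model). The key ingredient is `padding="causal"` in `Conv1D`, which ensures each output timestep only depends on current and past inputs:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Toy univariate series: predict the next value from the last 16 steps.
window = 16
series = np.sin(np.linspace(0, 100, 1000)).astype("float32")
X = np.stack([series[i : i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, channels)

model = models.Sequential([
    layers.Input(shape=(window, 1)),
    # padding="causal" left-pads the input so no output "sees" the future;
    # increasing dilation rates widen the receptive field over past steps.
    layers.Conv1D(32, kernel_size=3, padding="causal", dilation_rate=1, activation="relu"),
    layers.Conv1D(32, kernel_size=3, padding="causal", dilation_rate=2, activation="relu"),
    layers.Conv1D(32, kernel_size=3, padding="causal", dilation_rate=4, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(1),  # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```

Stacking dilated causal convolutions like this is what lets a ConvNet cover long histories with few layers, which is the property the chapter's causal ConvNet exploits.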
In Chapter 12, Multivariate Forecasting, we'll do some more deep learning, including a Transformer model and an LSTM.