Further reading
Check out the following resources:
- Fast ES-RNN: A GPU Implementation of the ES-RNN Algorithm: https://arxiv.org/abs/1907.03329 and https://github.com/damitkwr/ESRNN-GPU
- Functions as Vector Spaces: https://www.youtube.com/watch?v=NvEZol2Q8rs
- Forecast with N-BEATS, by Gaetan Dubuc: https://www.kaggle.com/code/gatandubuc/forecast-with-n-beats-interpretable-model/notebook
- WaveNet: A Generative Model for Raw Audio, by DeepMind: https://www.deepmind.com/blog/wavenet-a-generative-model-for-raw-audio
- What is Residual Connection?, by Wanshun Wong: https://towardsdatascience.com/what-is-residual-connection-efb07cab0d55
- Efficient Transformers: A Survey, by Tay et al.: https://arxiv.org/abs/2009.06732
- Autocorrelation and the Wiener-Khinchin theorem: https://www.itp.tu-berlin.de/fileadmin/a3233/grk/pototskyLectures2012/pototsky_lectures_part1.pdf
- Modelling Long- and Short-Term Temporal Patterns with Deep Neural Networks...