Deep Learning for Time Series Cookbook

You're reading from Deep Learning for Time Series Cookbook: Use PyTorch and Python recipes for forecasting, classification, and anomaly detection

Product type: Paperback
Published in: Mar 2024
Publisher: Packt
ISBN-13: 9781805129233
Length: 274 pages
Edition: 1st Edition

Authors (2): Luís Roque, Vitor Cerqueira
Table of Contents (12)

Preface
1. Chapter 1: Getting Started with Time Series
2. Chapter 2: Getting Started with PyTorch
3. Chapter 3: Univariate Time Series Forecasting
4. Chapter 4: Forecasting with PyTorch Lightning
5. Chapter 5: Global Forecasting Models
6. Chapter 6: Advanced Deep Learning Architectures for Time Series Forecasting
7. Chapter 7: Probabilistic Time Series Forecasting
8. Chapter 8: Deep Learning for Time Series Classification
9. Chapter 9: Deep Learning for Time Series Anomaly Detection
10. Index
11. Other Books You May Enjoy

Training an LSTM neural network

RNNs suffer from a fundamental problem known as “vanishing gradients”: because of how backpropagation through time works, the influence of earlier inputs on the overall error diminishes drastically as the sequence gets longer. This is especially problematic in sequence processing tasks with long-term dependencies (i.e., where outputs depend on inputs from much earlier in the sequence).
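
To make this concrete, here is a small, hypothetical demonstration (not taken from the book) that feeds increasingly long random sequences through a plain nn.RNN and prints the gradient magnitude reaching the first time step; with default initialization, it typically shrinks sharply as the sequence grows:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Plain (vanilla) RNN with tanh activation and default initialization
rnn = nn.RNN(input_size=1, hidden_size=32, batch_first=True)

for seq_len in (10, 100, 500):
    x = torch.randn(1, seq_len, 1, requires_grad=True)
    out, _ = rnn(x)
    # Use the last output as a stand-in for the loss and backpropagate
    out[:, -1].sum().backward()
    # Gradient that reaches the very first input in the sequence
    print(f"seq_len={seq_len}: |grad| at first step = {x.grad[0, 0].abs().item():.2e}")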

Getting ready

LSTM networks were introduced to overcome this problem. Compared with standard RNNs, each of their units has a more complex internal structure. Specifically, an LSTM can decide which information to store and which to discard using an internal memory structure called the cell. The cell uses gates (the input, forget, and output gates) to control the flow of information into and out of it, which helps the network maintain and manipulate long-term information and thereby mitigates the vanishing gradient problem.
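
As a quick illustration in PyTorch (a minimal sketch with assumed names and hyperparameters, not the recipe from this chapter), nn.LSTM implements exactly these gates internally and returns the cell state alongside the hidden state:

import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Minimal LSTM model for one-step-ahead forecasting (illustrative only)."""
    def __init__(self, input_size=1, hidden_size=32, num_layers=1):
        super().__init__()
        # nn.LSTM applies the input, forget, and output gates internally
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        out, (h_n, c_n) = self.lstm(x)   # c_n is the cell state maintained by the gates
        return self.head(out[:, -1])     # predict the next value from the last hidden state

model = LSTMForecaster()
x = torch.randn(16, 24, 1)   # a batch of 16 windows, each with 24 time steps
y_hat = model(x)             # shape: (16, 1)

Predicting from the last hidden state is the simplest choice for one-step-ahead forecasting; other setups use the outputs at every time step or a multi-step output head.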

How to do it…

...
You have been reading a chapter from
Deep Learning for Time Series Cookbook