Hands-On Machine Learning with C++
Build, train, and deploy end-to-end machine learning and deep learning pipelines

Product type: Paperback
Published: May 2020
Publisher: Packt
ISBN-13: 9781789955330
Length: 530 pages
Edition: 1st
Author: Kirill Kolodiazhnyi
Table of Contents

Preface
1. Section 1: Overview of Machine Learning
2. Introduction to Machine Learning with C++
3. Data Processing
4. Measuring Performance and Selecting Models
5. Section 2: Machine Learning Algorithms
6. Clustering
7. Anomaly Detection
8. Dimensionality Reduction
9. Classification
10. Recommender Systems
11. Ensemble Learning
12. Section 3: Advanced Examples
13. Neural Networks for Image Classification
14. Sentiment Analysis with Recurrent Neural Networks
15. Section 4: Production and Deployment Challenges
16. Exporting and Importing Models
17. Deploying Models on Mobile and Cloud Platforms
18. Other Books You May Enjoy

Training RNNs using the concept of backpropagation through time

At the time of writing, the error backpropagation algorithm is used for training neural networks almost everywhere. The result of performing inference on the training set of examples (in our case, the set of subsequences) is compared against the expected result (the labeled data). The difference between the actual and expected values is called the error. This error is propagated back through the network weights in the opposite direction. The network thus adapts to the labeled data, and the result of this adaptation is expected to work well for data the network did not encounter in the initial training examples (the generalization hypothesis).

In the case of a recurrent network, we have several options as to which network outputs we compute the error from. This section describes the two main approaches: the first considers the...
