Deep Learning Quick Reference

You're reading from Deep Learning Quick Reference: Useful hacks for training and optimizing deep neural networks with TensorFlow and Keras

Product type: Paperback
Published: Mar 2018
Publisher: Packt
ISBN-13: 9781788837996
Length: 272 pages
Edition: 1st Edition
Author: Mike Bernico
Table of Contents

Preface
1. The Building Blocks of Deep Learning
2. Using Deep Learning to Solve Regression Problems
3. Monitoring Network Training Using TensorBoard
4. Using Deep Learning to Solve Binary Classification Problems
5. Using Keras to Solve Multiclass Classification Problems
6. Hyperparameter Optimization
7. Training a CNN from Scratch
8. Transfer Learning with Pretrained CNNs
9. Training an RNN from Scratch
10. Training LSTMs with Word Embeddings from Scratch
11. Training Seq2Seq Models
12. Using Deep Reinforcement Learning
13. Generative Adversarial Networks
14. Other Books You May Enjoy

Introducing recurrent neural networks

In case the definition is unclear, let's look at an example: a stock market ticker, where we might observe the price of a stock, such as Alphabet Inc., changing over time. A price series like that is an example of a time series.
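The idea is easy to show in code. Below is a rough sketch, using made-up prices rather than Alphabet's actual quotes, of how an ordered series of observations can be sliced into the fixed-length windows that Keras recurrent layers expect; the make_windows helper and the window size of 3 are illustrative choices of mine, not anything taken from the book.

import numpy as np

# Hypothetical daily closing prices (a univariate time series).
prices = np.array([1050.2, 1053.4, 1049.8, 1062.1, 1071.5,
                   1068.9, 1075.3, 1080.0, 1077.6, 1082.4])

def make_windows(series, window_size):
    """Slice a 1D series into overlapping (input, target) pairs.

    Each input holds window_size consecutive observations; the target
    is the observation that immediately follows them.
    """
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])
        y.append(series[i + window_size])
    # Keras recurrent layers expect (samples, timesteps, features).
    return np.array(X)[..., np.newaxis], np.array(y)

X, y = make_windows(prices, window_size=3)
print(X.shape, y.shape)  # (7, 3, 1) (7,)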

In the next chapter, we will talk about using recurrent neural networks to model language, which is another type of sequence: a sequence of words. Since you're reading this book, you undoubtedly have some intuition about language sequences already.

If you're new to time series, you might be wondering whether it would be possible to use a normal multilayer perceptron to solve a time series problem. You most certainly could; in practice, though, you will almost always get better results with a recurrent network. Beyond that, recurrent neural networks have two other advantages for modeling sequences (a minimal code sketch follows the list):

  • They...
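To make the contrast with a multilayer perceptron concrete, here is a minimal sketch of a small recurrent regressor in Keras. It consumes windows shaped like the ones built above; the layer sizes, optimizer, and training settings are illustrative assumptions of mine rather than the author's code.

from keras.models import Sequential
from keras.layers import SimpleRNN, Dense

# A minimal recurrent regressor: the SimpleRNN layer reads each window
# one timestep at a time, and the Dense layer predicts the next value.
# The unit count, optimizer, and epoch count are illustrative, not
# values taken from the book.
model = Sequential()
model.add(SimpleRNN(32, input_shape=(3, 1)))  # (timesteps, features)
model.add(Dense(1))
model.compile(optimizer='adam', loss='mean_squared_error')

# X and y come from the windowing sketch above.
model.fit(X, y, epochs=10, batch_size=2, verbose=0)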