Hands-On Artificial Intelligence for Beginners

You're reading from Hands-On Artificial Intelligence for Beginners: An introduction to AI concepts, algorithms, and their implementation

Product type: Paperback
Published in: Oct 2018
Publisher: Packt
ISBN-13: 9781788991063
Length: 362 pages
Edition: 1st Edition
Authors (2): David Dindi, Patrick D. Smith
Table of Contents (15 chapters)

Preface
1. The History of AI
2. Machine Learning Basics
3. Platforms and Other Essentials
4. Your First Artificial Neural Networks
5. Convolutional Neural Networks
6. Recurrent Neural Networks
7. Generative Models
8. Reinforcement Learning
9. Deep Learning for Intelligent Agents
10. Deep Learning for Game Playing
11. Deep Learning for Finance
12. Deep Learning for Robotics
13. Deploying and Maintaining AI Applications
14. Other Books You May Enjoy

Summary

RNNs are the primary means by which we reason over textual inputs, and they come in a variety of forms. In this chapter, we learned about the recurrent structure of RNNs, as well as special variants that utilize memory cells. RNNs can be used for any kind of sequence prediction: generating music, text, image captions, and more.
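As a minimal sketch of that recurrent structure (illustrative NumPy code, not taken from the book; all variable names here are our own), a vanilla RNN applies the same weights at every time step, folding each input into a running hidden state:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: the new hidden state depends on the
    current input x_t and the previous hidden state h_prev."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim, steps = 4, 3, 5
W_xh = rng.normal(size=(input_dim, hidden_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
for t in range(steps):
    x_t = rng.normal(size=input_dim)
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)  # same weights reused at every step
```

Because the hidden state is squashed through `tanh` and re-multiplied by `W_hh` at every step, information from early inputs fades over long sequences, which motivates the memory cells discussed below.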

RNNs differ from feedforward networks in that they have recurrence: each step in an RNN depends on the network's memory of the previous state, along with its own weights and bias factors. As a result, vanilla RNNs struggle with long-term dependencies; they find it difficult to remember sequences beyond a certain number of time steps back. GRUs and LSTMs utilize memory gating mechanisms to control what they remember and forget, and hence overcome the long-term dependency problem that vanilla RNNs run into. RNN/CNN hybrids with...
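The gating idea can be sketched as follows (again illustrative NumPy code with our own variable names, not the book's implementation): an LSTM cell keeps a separate cell state `c` and uses sigmoid-valued forget, input, and output gates to decide what to erase, what to write, and what to expose:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b hold the stacked parameters for the
    forget (f), input (i), candidate (g), and output (o) gates."""
    z = x_t @ W + h_prev @ U + b
    f, i, g, o = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * g   # forget part of the old memory, write the new candidate
    h = o * np.tanh(c)       # expose a gated view of the cell state
    return h, c

rng = np.random.default_rng(1)
input_dim, hidden = 4, 3
W = rng.normal(size=(input_dim, 4 * hidden)) * 0.1
U = rng.normal(size=(hidden, 4 * hidden)) * 0.1
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for x_t in rng.normal(size=(5, input_dim)):  # a 5-step sequence
    h, c = lstm_step(x_t, h, c, W, U, b)
```

Because the cell state update `c = f * c_prev + i * g` is additive rather than a repeated matrix squashing, gradients can flow across many more time steps, which is how LSTMs (and, with a merged gate structure, GRUs) address the long-term dependency problem.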
