TensorFlow Machine Learning Projects

You're reading from TensorFlow Machine Learning Projects: Build 13 real-world projects with advanced numerical computations using the Python ecosystem

Product type: Paperback
Published: Nov 2018
Publisher: Packt
ISBN-13: 9781789132212
Length: 322 pages
Edition: 1st Edition
Authors (2): Ankit Jain and Dr. Amita Kapoor
Table of Contents (17)

Preface
1. Overview of TensorFlow and Machine Learning
2. Using Machine Learning to Detect Exoplanets in Outer Space
3. Sentiment Analysis in Your Browser Using TensorFlow.js
4. Digit Classification Using TensorFlow Lite
5. Speech to Text and Topic Extraction Using NLP
6. Predicting Stock Prices using Gaussian Process Regression
7. Credit Card Fraud Detection using Autoencoders
8. Generating Uncertainty in Traffic Signs Classifier Using Bayesian Neural Networks
9. Generating Matching Shoe Bags from Shoe Images Using DiscoGANs
10. Classifying Clothing Images using Capsule Networks
11. Making Quality Product Recommendations Using TensorFlow
12. Object Detection at a Large Scale with TensorFlow
13. Generating Book Scripts Using LSTMs
14. Playing Pacman Using Deep Reinforcement Learning
15. What is Next?
16. Other Books You May Enjoy

Understanding recurrent neural networks

Recurrent neural networks (RNNs) have become extremely popular for tasks that involve sequential data. The core idea behind RNNs is to exploit the sequential information present in the data. A standard feedforward neural network assumes that all of its inputs are independent of each other. However, if we are trying to predict the next word in a sentence or the next point in a time series, we need to use information from the preceding words or from the earlier points in the series.
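
For instance, a next-word prediction model can be sketched with TensorFlow's Keras API. The vocabulary size and layer sizes below are illustrative placeholders rather than values taken from this book:

```python
import tensorflow as tf

# Illustrative sizes only; not values from the book.
vocab_size, embedding_dim, rnn_units = 10000, 64, 128

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.SimpleRNN(rnn_units),                      # reads the sequence one step at a time
    tf.keras.layers.Dense(vocab_size, activation="softmax"),   # distribution over the next word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Because the recurrent layer consumes the sequence step by step, its prediction for the next word depends on all the words that came before it, not just the current one.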

One way to think about RNNs is that they have a memory that stores information about the history of the sequence. In theory, RNNs can remember this history over arbitrarily long sequences; in practice, however, they perform poorly on tasks where the historical information needs to be retained for more than...
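
This memory is simply a hidden state that is updated at every time step. A minimal NumPy sketch of the basic recurrence, with illustrative shapes and random weights rather than the book's own code, looks like this:

```python
import numpy as np

# Illustrative shapes and random weights; not the book's code.
input_dim, hidden_dim, seq_len = 8, 16, 5
W_x = 0.1 * np.random.randn(hidden_dim, input_dim)   # input-to-hidden weights
W_h = 0.1 * np.random.randn(hidden_dim, hidden_dim)  # hidden-to-hidden weights
b = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)                  # the "memory": starts out empty
for t in range(seq_len):
    x_t = np.random.randn(input_dim)      # stand-in for the input at step t
    h = np.tanh(W_x @ x_t + W_h @ h + b)  # new state mixes the current input with the past state
# h now summarizes everything the network has seen in the sequence
```

Because the same weights are reused at every step, the hidden state h acts as a running summary of the sequence, which is exactly the memory described above.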

The rest of the chapter is locked.