Python Deep Learning

You're reading from Python Deep Learning: Next generation techniques to revolutionize computer vision, AI, speech and data analysis

Product type: Paperback
Published: April 2017
Publisher: Packt
ISBN-13: 9781786464453
Length: 406 pages
Edition: 1st Edition
Authors (4): Peter Roelants, Daniel Slater, Valentino Zocca, Gianmario Spacagna
Table of Contents (12)

Preface
1. Machine Learning – An Introduction
2. Neural Networks
3. Deep Learning Fundamentals
4. Unsupervised Feature Learning
5. Image Recognition
6. Recurrent Neural Networks and Language Models
7. Deep Learning for Board Games
8. Deep Learning for Computer Games
9. Anomaly Detection
10. Building a Production-Ready Intrusion Detection System
Index

Recurrent neural networks


RNNs get their name because they recurrently apply the same function over a sequence. An RNN can be written as a recurrence relation defined by this function:

S_t = f(S_{t-1}, X_t)

Here, S_t (the state at step t) is computed by the function f from the state at the previous step, S_{t-1}, and the input X_t at the current step. This recurrence relation defines how the state evolves over the sequence via a feedback loop over previous states, as illustrated in the following figure:

Figure from [3]. Left: visual illustration of the RNN recurrence relation S_t = S_{t-1} * W + X_t * U; the final output is o_t = V * S_t. Right: RNN states recurrently unfolded over the sequence t-1, t, t+1. Note that the parameters U, V, and W are shared between all the steps.
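To make the recurrence concrete, here is a minimal Python sketch that applies an arbitrary function f step by step over a sequence, exactly as the relation above describes. The helper name rnn_unroll and the toy accumulator used in the example are illustrative choices, not code from the book.

```python
def rnn_unroll(f, x_sequence, s_initial):
    """Apply f recurrently over a sequence, returning all intermediate states."""
    states = []
    s = s_initial
    for x in x_sequence:
        s = f(s, x)        # S_t = f(S_{t-1}, X_t)
        states.append(s)
    return states

# Example with a trivial f that simply accumulates the inputs:
states = rnn_unroll(lambda s, x: s + x, [1, 2, 3, 4], 0)
print(states)  # [1, 3, 6, 10]
```

Because the same f is reused at every step, the unrolled network can process sequences of any length with a fixed set of parameters.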

Here f can be any differentiable function. For example, a basic RNN is defined by the following recurrence relation:

S_t = tanh(S_{t-1} * W + X_t * U)

Here W defines a linear transformation from state to state...
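As a rough sketch of this basic RNN, the NumPy snippet below unrolls S_t = tanh(S_{t-1} * W + X_t * U) over a short input sequence and computes an output o_t = V * S_t at each step. The dimensions and random initialization are illustrative assumptions rather than values from the book; note that the same W, U, and V are reused at every step, matching the weight sharing shown in the figure.

```python
import numpy as np

# Illustrative sizes: 3 state units, 2 input features, 1 output value.
state_size, input_size, output_size = 3, 2, 1
rng = np.random.default_rng(0)
W = rng.normal(size=(state_size, state_size))   # state-to-state transformation
U = rng.normal(size=(input_size, state_size))   # input-to-state transformation
V = rng.normal(size=(state_size, output_size))  # state-to-output transformation

def rnn_forward(x_sequence, W, U, V):
    """Unroll the basic RNN over a sequence of input vectors."""
    s = np.zeros(state_size)                     # initial state S_0
    outputs = []
    for x_t in x_sequence:
        s = np.tanh(s @ W + x_t @ U)             # S_t = tanh(S_{t-1} W + X_t U)
        outputs.append(s @ V)                    # o_t = V S_t
    return outputs

x_sequence = rng.normal(size=(5, input_size))    # a toy sequence of 5 steps
outputs = rnn_forward(x_sequence, W, U, V)
```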
