Deep Learning with TensorFlow: Explore neural networks with Python
By Md. Rezaul Karim, Ahmed Menshawy, Giancarlo Zaccone, Fabrizio Milo (Packt, April 2017)

ReLU classifier

The last architectural change improved the accuracy of our model, but we can do even better by replacing the sigmoid activation function with the Rectified Linear Unit (ReLU), shown in the following figure:

Figure: The ReLU function

A Rectified Linear Unit (ReLU) computes the function f(x) = max(0, x). It is computationally cheap because it requires no exponential computation, unlike the sigmoid or tanh activations, and it has been found to greatly accelerate the convergence of stochastic gradient descent compared to the sigmoid/tanh functions.
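
To see the difference concretely, the following minimal sketch (assuming TensorFlow 1.x, as used throughout this book) evaluates both activations on a few sample values. ReLU needs only an element-wise comparison with zero, whereas sigmoid needs an exponential per element:

import tensorflow as tf  # TensorFlow 1.x assumed

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

relu_out = tf.nn.relu(x)     # max(0, x): one comparison per element
sigmoid_out = tf.sigmoid(x)  # 1 / (1 + exp(-x)): one exponential per element

with tf.Session() as sess:
    print(sess.run(relu_out))     # [0.  0.  0.  0.5 2. ]
    print(sess.run(sigmoid_out))  # [0.1192 0.3775 0.5  0.6225 0.8808]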

To use the ReLU function, we simply change the definitions of the first four layers in the previously implemented model, as follows.

First layer output:

Y1 = tf.nn.relu(tf.matmul(XX, W1) + B1)  

Second layer output:

Y2 = tf.nn.relu(tf.matmul(Y1, W2) + B2) 

Third layer output:

Y3 = tf.nn.relu(tf.matmul(Y2, W3) + B3) ...
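
For reference, here is how these ReLU layers fit into a complete five-layer MNIST-style classifier like the one implemented earlier. This is a minimal sketch assuming TensorFlow 1.x; the layer sizes (200, 100, 60, 30) and the truncated-normal initialization are illustrative assumptions rather than the book's exact values:

import tensorflow as tf  # TensorFlow 1.x assumed

# Illustrative layer sizes for a 784-input (28x28 MNIST), 10-class model
K, L, M, N = 200, 100, 60, 30

XX = tf.placeholder(tf.float32, [None, 784])  # flattened input images
W1 = tf.Variable(tf.truncated_normal([784, K], stddev=0.1))
B1 = tf.Variable(tf.zeros([K]))
W2 = tf.Variable(tf.truncated_normal([K, L], stddev=0.1))
B2 = tf.Variable(tf.zeros([L]))
W3 = tf.Variable(tf.truncated_normal([L, M], stddev=0.1))
B3 = tf.Variable(tf.zeros([M]))
W4 = tf.Variable(tf.truncated_normal([M, N], stddev=0.1))
B4 = tf.Variable(tf.zeros([N]))
W5 = tf.Variable(tf.truncated_normal([N, 10], stddev=0.1))
B5 = tf.Variable(tf.zeros([10]))

# ReLU on the four hidden layers, softmax on the output layer
Y1 = tf.nn.relu(tf.matmul(XX, W1) + B1)
Y2 = tf.nn.relu(tf.matmul(Y1, W2) + B2)
Y3 = tf.nn.relu(tf.matmul(Y2, W3) + B3)
Y4 = tf.nn.relu(tf.matmul(Y3, W4) + B4)
Ylogits = tf.matmul(Y4, W5) + B5
Y = tf.nn.softmax(Ylogits)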