Hands-On Deep Learning with TensorFlow

Pooling layer application

In this section, we're going to take a look at the TensorFlow function for max pooling, then we'll talk about transitioning from a pooling layer back to a fully connected layer. Finally, we'll visually look at the pooling output to verify its reduced size.

Let's pick up our example where we left off in the previous section. Make sure you've executed everything up to the pooling layer code before starting this exercise.

Recall we've put a 10x10 image through a 3x3 convolution and rectified linear activation. Now, let's add a 2x2 max pooling layer that comes after our convolutional layer.
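
The previous section's code isn't reproduced here, but as a rough reference, a minimal sketch of how h1 might have been built, assuming a single-channel 10x10 input, four filters, and 'SAME' padding (the variable names, filter count, and padding choice are assumptions for illustration), looks like this:

import math
import tensorflow as tf

# Hypothetical reconstruction of the earlier layers: a 10x10 grayscale input,
# a 3x3 convolution, and a rectified linear activation producing h1.
x = tf.placeholder(tf.float32, [None, 10, 10])
x_im = tf.reshape(x, [-1, 10, 10, 1])             # add a channel dimension
num_filters = 4                                   # filter count is an assumption
W1 = tf.Variable(tf.truncated_normal([3, 3, 1, num_filters],
                                     stddev=1.0 / math.sqrt(3 * 3)))
b1 = tf.Variable(tf.constant(0.1, shape=[num_filters]))
conv = tf.nn.conv2d(x_im, W1, strides=[1, 1, 1, 1], padding='SAME')
h1 = tf.nn.relu(conv + b1)                        # 10x10 feature maps, ready to pool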

p1 = tf.nn.max_pool(h1, ksize=[1, 2, 2, 1],
                    strides=[1, 2, 2, 1], padding='VALID')

The key to this is tf.nn.max_pool. The first argument is just the output of our previous convolutional layer, h1. Next, we have the strange-looking ksize argument, which really just defines the window size of our pooling, in this case 2x2. The first 1 refers...
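
Before moving on, it's worth verifying the reduced size and sketching the transition back to a fully connected layer mentioned at the start of this section. Here is a minimal sketch, carrying over the assumed shapes from the reconstruction above (the hidden width of 32 is also an illustrative assumption):

# With the 'SAME'-padded convolution assumed above, h1 is (?, 10, 10, num_filters),
# so 2x2 pooling with a stride of 2 halves each spatial dimension to 5x5.
print(p1.get_shape())                             # expect (?, 5, 5, num_filters)

# Flatten the pooled feature maps so they can feed a fully connected layer.
p1_size = 5 * 5 * num_filters
p1_flat = tf.reshape(p1, [-1, p1_size])
W2 = tf.Variable(tf.truncated_normal([p1_size, 32], stddev=0.1))
b2 = tf.Variable(tf.constant(0.1, shape=[32]))
h2 = tf.nn.relu(tf.matmul(p1_flat, W2) + b2)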
