Practical Convolutional Neural Networks

You're reading from Practical Convolutional Neural Networks

Product type: Book
Published: February 2018
Publisher: Packt
ISBN-13: 9781788392303
Pages: 218
Edition: 1st
Authors (3): Mohit Sewak, Md. Rezaul Karim, Pradeep Pujari
Table of Contents (11 chapters)

Preface
1. Deep Neural Networks – Overview
2. Introduction to Convolutional Neural Networks
3. Build Your First CNN and Performance Optimization
4. Popular CNN Model Architectures
5. Transfer Learning
6. Autoencoders for CNN
7. Object Detection and Instance Segmentation with CNN
8. GAN: Generating New Images with CNN
9. Attention Mechanism for CNN and Visual Models
10. Other Books You May Enjoy

Types of Attention


There are two types of attention mechanisms. They are as follows:

  • Hard attention
  • Soft attention

Let's now take a look at each one in detail in the following sections.

Hard Attention

In reality, in our recent image caption example, several more pictures would be selected, but because we trained with the handwritten captions, those would never be weighted higher. The essential thing to understand, however, is how the system decides which pixels (or, more precisely, which of their CNN representations) to focus on when drawing these high-resolution views of different aspects of the image, and then how it chooses the next point so that the process can be repeated.

In the preceding example, the points are chosen at random from a distribution, and the process is repeated. Which pixels around each chosen point receive a higher resolution is likewise decided inside the attention network. This type of attention is known as hard attention.
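
As a rough sketch of this selection step (plain NumPy, with made-up array shapes, names, and a 7 x 7 grid of locations; this is not code from the book), hard attention normalizes the scores produced by the attention network into a distribution over locations, samples one location at a time, and repeats the process, whereas soft attention would instead average over all locations:

    import numpy as np

    rng = np.random.default_rng(0)

    features = rng.normal(size=(7 * 7, 512))   # CNN feature vectors for a 7x7 grid of locations
    scores = rng.normal(size=7 * 7)            # scores produced by the attention network

    # Normalize the scores into a probability distribution over locations.
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()

    # Hard attention: sample ONE location at a time and attend only to it,
    # then repeat the process for the next step.
    for step in range(3):
        loc = rng.choice(len(probs), p=probs)  # discrete, stochastic choice of a point
        glimpse = features[loc]                # only this location is attended to
        print(f"step {step}: attended location {loc}, glimpse shape {glimpse.shape}")

    # Soft attention, for contrast, averages over all locations with these weights.
    soft_context = probs @ features

The random draw inside the loop is exactly the discrete choice described above, and it is this sampling step that leads to the problem discussed next.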

Hard attention has something called the differentiability problem. Let's spend some...
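
As a brief, hedged illustration of what that problem looks like in practice (a minimal TensorFlow sketch with illustrative shapes; not the book's own code), the soft, weighted average of locations has a well-defined gradient with respect to the attention scores, while the sampled, hard selection does not, so plain backpropagation cannot train it directly:

    import tensorflow as tf

    features = tf.random.normal([49, 512])        # CNN features for 49 locations (illustrative)
    scores = tf.Variable(tf.random.normal([49]))  # attention scores we would like to train

    with tf.GradientTape(persistent=True) as tape:
        weights = tf.nn.softmax(scores)

        # Soft attention: a weighted average of all locations (smooth in the scores).
        soft_context = tf.tensordot(weights, features, axes=1)
        soft_loss = tf.reduce_sum(soft_context)

        # Hard attention: sample a single location (a discrete, stochastic choice).
        idx = tf.random.categorical(tf.math.log(weights)[tf.newaxis, :], 1)[0, 0]
        hard_loss = tf.reduce_sum(tf.gather(features, idx))

    print(tape.gradient(soft_loss, scores))  # a gradient vector: backpropagation works
    print(tape.gradient(hard_loss, scores))  # None: no gradient flows through the sampling step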
