Mastering PyTorch: Build powerful neural network architectures using advanced PyTorch 1.x features

Author: Ashish Ranjan Jha
Publisher: Packt | Published: Feb 2021 | Edition: 1st | ISBN-13: 9781789614381 | Length: 450 pages | Product type: Paperback
Table of Contents

Preface
Section 1: PyTorch Overview
Chapter 1: Overview of Deep Learning using PyTorch
Chapter 2: Combining CNNs and LSTMs
Section 2: Working with Advanced Neural Network Architectures
Chapter 3: Deep CNN Architectures
Chapter 4: Deep Recurrent Model Architectures
Chapter 5: Hybrid Advanced Models
Section 3: Generative Models and Deep Reinforcement Learning
Chapter 6: Music and Text Generation with PyTorch
Chapter 7: Neural Style Transfer
Chapter 8: Deep Convolutional GANs
Chapter 9: Deep Reinforcement Learning
Section 4: PyTorch in Production Systems
Chapter 10: Operationalizing PyTorch Models into Production
Chapter 11: Distributed Training
Chapter 12: PyTorch and AutoML
Chapter 13: PyTorch and Explainable AI
Chapter 14: Rapid Prototyping with PyTorch
Other Books You May Enjoy

Chapter 2: Combining CNNs and LSTMs

Convolutional Neural Networks (CNNs) are a type of deep learning model known for solving machine learning problems related to images and videos, such as image classification, object detection, segmentation, and more. This is because CNNs use a special type of layer called a convolutional layer, which has shared learnable parameters. This weight or parameter sharing works because the patterns to be learned in an image (such as edges or contours) are assumed to be independent of the location of the pixels in the image. Just as CNNs are applied to images, Long Short-Term Memory (LSTM) networks – which are a type of Recurrent Neural Network (RNN) – prove to be extremely effective at solving machine learning problems related to sequential data. An example of sequential data is text: in a sentence, each word depends on the previous word(s). LSTM models are meant to model such sequential dependencies.
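
To make the two building blocks concrete, here is a minimal PyTorch sketch that wires a small CNN encoder into an LSTM decoder, in the style of an image-captioning model. The class name, layer sizes, and the captioning-style wiring are illustrative assumptions for this sketch, not the exact model developed later in the chapter:

```python
import torch
import torch.nn as nn

class CNNLSTMSketch(nn.Module):
    """Toy CNN encoder + LSTM decoder, illustrating the two layer types."""
    def __init__(self, embed_dim=256, hidden_dim=512, vocab_size=1000):
        super().__init__()
        # CNN part: each convolutional kernel's weights are shared across
        # all spatial positions of the input image.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)
        # LSTM part: processes a sequence of word embeddings, so each step
        # can depend on the previous words.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        # images: (batch, 3, H, W); captions: (batch, seq_len) of word indices
        feats = self.cnn(images).flatten(1)        # (batch, 32)
        feats = self.fc(feats).unsqueeze(1)        # (batch, 1, embed_dim)
        words = self.embed(captions)               # (batch, seq_len, embed_dim)
        seq = torch.cat([feats, words], dim=1)     # image feature as first "token"
        hidden, _ = self.lstm(seq)
        return self.out(hidden)                    # (batch, seq_len + 1, vocab_size)
```

The convolutional layers exploit the location-independence assumption described above, while the LSTM consumes the resulting image feature together with the word sequence, so each predicted word can depend on both the image and the words that came before it.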

These two different...
