Mastering PyTorch

You're reading from Mastering PyTorch: Build powerful neural network architectures using advanced PyTorch 1.x features

Product type: Paperback
Published: Feb 2021
Publisher: Packt
ISBN-13: 9781789614381
Length: 450 pages
Edition: 1st Edition
Author: Ashish Ranjan Jha

Table of Contents

Preface
Section 1: PyTorch Overview
    Chapter 1: Overview of Deep Learning using PyTorch
    Chapter 2: Combining CNNs and LSTMs
Section 2: Working with Advanced Neural Network Architectures
    Chapter 3: Deep CNN Architectures
    Chapter 4: Deep Recurrent Model Architectures
    Chapter 5: Hybrid Advanced Models
Section 3: Generative Models and Deep Reinforcement Learning
    Chapter 6: Music and Text Generation with PyTorch
    Chapter 7: Neural Style Transfer
    Chapter 8: Deep Convolutional GANs
    Chapter 9: Deep Reinforcement Learning
Section 4: PyTorch in Production Systems
    Chapter 10: Operationalizing PyTorch Models into Production
    Chapter 11: Distributed Training
    Chapter 12: PyTorch and AutoML
    Chapter 13: PyTorch and Explainable AI
    Chapter 14: Rapid Prototyping with PyTorch
Other Books You May Enjoy

Fine-tuning the AlexNet model

In this section, we will first take a quick look at the AlexNet architecture and how to build one using PyTorch. Then we will explore PyTorch's repository of pre-trained CNN models and, finally, fine-tune a pre-trained AlexNet model on an image classification task and use it to make predictions.
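
The following is a minimal sketch of that workflow, assuming torchvision is installed and a hypothetical 10-class target dataset; the pretrained=True flag matches the PyTorch 1.x-era torchvision API, and the exact code used in this chapter may differ:

import torch
import torch.nn as nn
from torchvision import models

# Load AlexNet with ImageNet weights from torchvision's pre-trained model repository
model = models.alexnet(pretrained=True)

# Freeze the convolutional feature extractor so only the classifier head is fine-tuned
for param in model.features.parameters():
    param.requires_grad = False

# Replace the final fully connected layer; num_classes = 10 is a placeholder for your dataset
num_classes = 10
model.classifier[6] = nn.Linear(4096, num_classes)

# Optimize only the parameters that still require gradients
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.001, momentum=0.9
)

# Prediction on a dummy 224 x 224 RGB image tensor
model.eval()
with torch.no_grad():
    image = torch.randn(1, 3, 224, 224)
    predicted_class = model(image).argmax(dim=1)

Freezing model.features keeps the ImageNet-learned convolutional filters fixed, which is a common starting point when the target dataset is small.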

AlexNet is a successor to LeNet with incremental architectural changes: 8 layers (5 convolutional and 3 fully connected) instead of 5, roughly 60 million model parameters instead of 60,000, and MaxPool instead of AvgPool. Moreover, AlexNet was trained and tested on a much bigger dataset, ImageNet, which is over 100 GB in size, as opposed to the MNIST dataset (on which LeNet was trained), which amounts to a few MB. AlexNet truly revolutionized CNNs by showing that they could significantly outperform classical machine learning models, such as SVMs, on image-related tasks. Figure 3.14 shows...
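
As a rough illustration of that layer structure (following torchvision's AlexNet implementation rather than the original paper's exact channel sizes, and not the code accompanying Figure 3.14), an AlexNet-style model can be written in PyTorch as follows:

import torch
import torch.nn as nn

class AlexNet(nn.Module):
    def __init__(self, num_classes=1000):
        super().__init__()
        # 5 convolutional layers, with MaxPool (rather than AvgPool) after the 1st, 2nd and 5th
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        # 3 fully connected layers (assumes 224 x 224 inputs, giving a 256 x 6 x 6 feature map)
        self.classifier = nn.Sequential(
            nn.Dropout(),
            nn.Linear(256 * 6 * 6, 4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)

With num_classes=1000, this sketch has roughly 60 million parameters, in line with the figure quoted above.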
