Deep Learning with PyTorch Quick Start Guide

You're reading from Deep Learning with PyTorch Quick Start Guide: Learn to train and deploy neural network models in Python

Product type: Paperback
Published in: Dec 2018
Publisher: Packt
ISBN-13: 9781789534092
Length: 158 pages
Edition: 1st Edition
Author: David Julian

Optimization techniques

The torch.optim package contains a number of optimization algorithms, and each of these algorithms has several parameters that we can use to fine-tune deep learning models. Optimization is a critical component of deep learning, so it is no surprise that the choice of optimization technique can be key to a model's performance. Remember, an optimizer's role is to store and update parameter state based on the calculated gradients of the loss function.
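The store-update cycle described above follows the same pattern for every optimizer in torch.optim: clear the stored gradients, compute new ones with backward(), and apply them with step(). The following is a minimal sketch of that loop; the model, toy data, and learning rate are illustrative choices, not taken from the book.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# A tiny linear model fitted with plain SGD on random data.
model = nn.Linear(3, 1)
optimizer = optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

x = torch.randn(8, 3)   # toy inputs
y = torch.randn(8, 1)   # toy targets

for _ in range(5):
    optimizer.zero_grad()             # clear gradients from the previous step
    loss = criterion(model(x), y)
    loss.backward()                   # compute gradients of the loss
    optimizer.step()                  # update parameters from the stored gradients
```

Because every optimizer exposes the same zero_grad()/step() interface, swapping in a different algorithm only changes the constructor call.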

Optimizer algorithms

There are a number of optimization algorithms besides SGD available in PyTorch. The following code shows the constructor signature, with its default parameter values, for one such algorithm:

optim.Adadelta(params, lr=1.0, rho=0.9, eps=1e-06, weight_decay=0)

The Adadelta algorithm is based on stochastic gradient...
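Using the constructor shown above, Adadelta drops into the standard training loop in place of SGD. Here is a hedged sketch of a single update step; the model shape and random data are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Swap Adadelta in for SGD using the constructor defaults shown above.
model = nn.Linear(4, 2)
optimizer = optim.Adadelta(model.parameters(), lr=1.0, rho=0.9,
                           eps=1e-06, weight_decay=0)

x = torch.randn(16, 4)
y = torch.randn(16, 2)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()   # one Adadelta update of the model parameters
```

Note that Adadelta's lr acts as a scaling factor on its adaptive per-parameter step sizes, which is why its default of 1.0 is much larger than a typical SGD learning rate.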
