The Artificial Intelligence Infrastructure Workshop

You're reading from The Artificial Intelligence Infrastructure Workshop: Build your own highly scalable and robust data storage systems that can support a variety of cutting-edge AI applications

Product type: Paperback
Published: Aug 2020
Publisher: Packt
ISBN-13: 9781800209848
Length: 732 pages
Edition: 1st Edition
Authors (6): Bas Geerdink, Chinmay Arankalle, Kunal Gera, Kevin Liao, Gareth Dwyer, Anand N.S.
Table of Contents (14)

Preface
1. Data Storage Fundamentals
2. Artificial Intelligence Storage Requirements
3. Data Preparation
4. The Ethics of AI Data Storage
5. Data Stores: SQL and NoSQL Databases
6. Big Data File Formats
7. Introduction to Analytics Engine (Spark) for Big Data
8. Data System Design Examples
9. Workflow Management for AI
10. Introduction to Data Storage on Cloud Services (AWS)
11. Building an Artificial Intelligence Algorithm
12. Productionizing Your AI Applications
Appendix

Mini-Batch SGD with PyTorch

Let's recap what we have learned so far. We started by implementing a gradient descent algorithm in NumPy. Then we were introduced to PyTorch, a modern deep learning library, and in the last exercise we implemented an improved version of the gradient descent algorithm in PyTorch. Now let's dig into the details of gradient descent.

There are three types of gradient descent algorithms:

  • Batch gradient descent
  • Stochastic gradient descent
  • Mini-batch stochastic gradient descent

While batch gradient descent computes the model parameters' gradients using the entire dataset, stochastic gradient descent computes the model parameters' gradients using a single sample from the dataset. However, using a single sample to compute gradients is very unreliable, and the estimated gradients are extremely noisy. So, most applications of stochastic gradient descent use more than one sample, or a mini-batch of a handful of samples, to compute gradients...
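To make the distinction concrete, here is a minimal sketch of mini-batch SGD in PyTorch. It is not the book's exercise code; the synthetic data, the model, and the batch size of 32 are illustrative assumptions. The key point is that the DataLoader's batch_size argument is what selects the flavor of gradient descent: the full dataset size gives batch gradient descent, 1 gives stochastic gradient descent, and a handful of samples gives mini-batch SGD.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic regression data (illustrative only): y = 3x + 2 plus noise
X = torch.randn(1000, 1)
y = 3 * X + 2 + 0.1 * torch.randn(1000, 1)

# batch_size chooses the gradient descent variant:
# len(dataset) -> batch GD, 1 -> stochastic GD, 32 -> mini-batch SGD
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for xb, yb in loader:
        optimizer.zero_grad()          # clear gradients from the previous step
        loss = loss_fn(model(xb), yb)  # loss on this mini-batch only
        loss.backward()                # gradients estimated from the mini-batch
        optimizer.step()               # update the parameters

Because each update uses only 32 samples, the gradient estimates are noisier than full-batch gradients but far cheaper to compute, which is the trade-off the passage above describes.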
