Hands-On Mathematics for Deep Learning

Build a solid mathematical foundation for training efficient deep neural networks

Product type: Paperback
Published: June 2020
Publisher: Packt
ISBN-13: 9781838647292
Length: 364 pages
Edition: 1st Edition
Author: Jay Dawani

Table of Contents (19)

Preface
1. Section 1: Essential Mathematics for Deep Learning
2. Linear Algebra
3. Vector Calculus
4. Probability and Statistics
5. Optimization
6. Graph Theory
7. Section 2: Essential Neural Networks
8. Linear Neural Networks
9. Feedforward Neural Networks
10. Regularization
11. Convolutional Neural Networks
12. Recurrent Neural Networks
13. Section 3: Advanced Deep Learning Concepts Simplified
14. Attention Mechanisms
15. Generative Models
16. Transfer and Meta Learning
17. Geometric Deep Learning
18. Other Books You May Enjoy

Preface

Most programmers and data scientists struggle with mathematics, having either overlooked or forgotten core mathematical concepts. This book helps you understand the math required to grasp how various neural networks work, so that you can go on to build better deep learning (DL) models.

You'll begin by learning about the core mathematical and modern computational techniques used to design and implement DL algorithms. The book covers essential topics, such as linear algebra, eigenvalues and eigenvectors, singular value decomposition (SVD), and gradient-based algorithms, to help you understand how deep neural networks are trained. Later chapters focus on important neural networks, such as linear neural networks, multilayer perceptrons, and radial basis function networks, with a primary focus on helping you learn how each model works. As you advance, you'll delve into the math behind normalization, multi-layered DL, forward propagation, optimization, and backpropagation to understand what it takes to build full-fledged DL models. Finally, you'll explore convolutional neural network (CNN), recurrent neural network (RNN), and generative adversarial network (GAN) models and their implementations.

By the end of this book, you'll have built a strong foundation in the mathematical concepts behind neural networks and DL, which will help you confidently research and build custom DL models.
