MATLAB for Machine Learning

You're reading from MATLAB for Machine Learning: Unlock the power of deep learning for swift and enhanced results

Product type: Paperback
Published: January 2024
Publisher: Packt
ISBN-13: 9781835087695
Length: 374 pages
Edition: 2nd
Author: Giuseppe Ciaburro
Table of Contents

Preface
Part 1: Getting Started with MATLAB
    Chapter 1: Exploring MATLAB for Machine Learning
    Chapter 2: Working with Data in MATLAB
Part 2: Understanding Machine Learning Algorithms in MATLAB
    Chapter 3: Prediction Using Classification and Regression
    Chapter 4: Clustering Analysis and Dimensionality Reduction
    Chapter 5: Introducing Artificial Neural Network Modeling
    Chapter 6: Deep Learning and Convolutional Neural Networks
Part 3: Machine Learning in Practice
    Chapter 7: Natural Language Processing Using MATLAB
    Chapter 8: MATLAB for Image Processing and Computer Vision
    Chapter 9: Time Series Analysis and Forecasting with MATLAB
    Chapter 10: MATLAB Tools for Recommender Systems
    Chapter 11: Anomaly Detection in MATLAB
Index
Other Books You May Enjoy

Understanding advanced regularization techniques

Advanced regularization techniques are methods used in ML and statistical modeling to prevent overfitting and improve the generalization performance of models. Overfitting occurs when a model fits the training data too closely, capturing noise and irrelevant patterns, which leads to poor performance on unseen data. Regularization techniques introduce constraints or penalties to the model’s parameters during training to encourage simpler, more generalized models.
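As a concrete illustration of a penalty at work, here is a minimal MATLAB sketch, assuming the Statistics and Machine Learning Toolbox's ridge function. It fits a deliberately over-flexible degree-9 polynomial to noisy data, once without a penalty and once with an L2 (ridge) penalty; the data, the polynomial degree, and the penalty value k = 10 are illustrative choices:

% Minimal ridge (L2) regularization sketch, assuming the
% Statistics and Machine Learning Toolbox.
rng(1);                                 % reproducible noise
x = linspace(0, 1, 30)';
y = sin(2*pi*x) + 0.2*randn(size(x));   % noisy target values

X = x.^(1:9);                           % degree-9 polynomial features (prone to overfitting)

bOLS   = ridge(y, X, 0, 0);             % k = 0  -> ordinary least squares (no penalty)
bRidge = ridge(y, X, 10, 0);            % k = 10 -> L2 penalty shrinks the coefficients

% The penalized fit has much smaller coefficient magnitudes,
% which corresponds to a smoother, better-generalizing curve.
disp([norm(bOLS(2:end)), norm(bRidge(2:end))]);

A larger penalty shrinks the coefficients more aggressively, trading a little training-set accuracy for better behavior on unseen data.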

Understanding dropout

Dropout is a regularization technique used in NNs, particularly deep NNs (DNNs), to prevent overfitting. Overfitting occurs when an NN learns to fit the training data too closely, capturing noise and memorizing specific examples rather than generalizing from the data. Dropout is a simple yet effective method for improving a model’s generalization performance.

During the training phase, at each forward and backward pass, dropout randomly deactivates (sets to zero) a fraction of the units in a layer, so the network cannot rely on any single neuron and is pushed to learn more robust, redundant representations.
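In MATLAB this behavior is available through the dropoutLayer object. The sketch below assumes the Deep Learning Toolbox; the layer sizes, the 0.5 drop probability, the 20 input features, the 3 output classes, and the training options are illustrative values, and XTrain/YTrain are hypothetical training arrays:

% Minimal dropout sketch, assuming the Deep Learning Toolbox.
layers = [
    featureInputLayer(20)               % 20 numeric input features
    fullyConnectedLayer(64)
    reluLayer
    dropoutLayer(0.5)                   % randomly zeroes 50% of activations during training
    fullyConnectedLayer(64)
    reluLayer
    dropoutLayer(0.5)
    fullyConnectedLayer(3)              % 3 output classes
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'L2Regularization', 1e-4, ...       % weight decay can be combined with dropout
    'Verbose', false);

% net = trainNetwork(XTrain, YTrain, layers, options);  % XTrain: N-by-20, YTrain: categorical

At inference time the dropout layers pass activations through unchanged, so no extra scaling or configuration is needed when calling the trained network.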
