You're reading from Machine Learning Solutions: Expert techniques to tackle complex machine learning problems using Python

Product type: Paperback
Published: Apr 2018
Publisher: Packt
ISBN-13: 9781788390040
Length: 566 pages
Edition: 1st Edition
Author: Jalaj Thanaki

Problems with the existing approach


We got the baseline score using the AdaBoost and GradientBoosting classifiers. Now we need to increase their accuracy. To do that, we first list all the areas that can be improved but that we haven't worked on extensively, along with the possible problems in the baseline approach. Once we have this list of problems and areas to work on, it will be easier for us to implement the revised approach.

Here, I'm listing some of the areas, or problems, that we haven't worked on in our baseline iteration:

  • Problem: We haven't used cross-validation techniques extensively to check for overfitting.

    • Solution: If we use cross-validation techniques properly, we will know whether our trained ML model suffers from overfitting. This helps because we don't want to build a model that can't be generalized properly.

  • Problem: We also haven't focused on hyperparameter tuning. In our baseline approach, we mostly used the default parameters, which are defined when the classifier is declared. You can refer to the code snippet given in Figure 1.52, where the classifier takes some parameters that are used when it trains the model. We haven't changed these parameters.

    • Solution: We need to tune these hyperparameters in such a way that we increase the accuracy of the classifier. There are various hyperparameter-tuning techniques that we can use for this.
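To make the cross-validation point concrete, here is a minimal sketch of checking a classifier with k-fold cross-validation using scikit-learn. The dataset here is synthetic (`make_classification`) purely for illustration; in the chapter, the features would come from the credit-risk data prepared earlier.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the real feature matrix and labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

clf = GradientBoostingClassifier(random_state=42)

# 5-fold cross-validation: a large gap between the training accuracy
# and the mean CV accuracy is a sign of overfitting.
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("CV accuracy: %.3f (+/- %.3f)" % (scores.mean(), scores.std()))
```

If the CV mean is far below the score measured on the training set itself, the model is memorizing the training data rather than generalizing.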

In the next section, we will look at how these optimization techniques actually work and discuss the approach we are going to take. So let's begin!
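As a preview of hyperparameter tuning, the sketch below uses scikit-learn's `GridSearchCV` to search over a small grid of AdaBoost parameters. The parameter values are illustrative assumptions, not the grid used in the book; in practice the grid would be chosen based on the data and the baseline results.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in dataset for illustration.
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

# Hypothetical grid: each combination is evaluated with 3-fold CV.
param_grid = {"n_estimators": [50, 100], "learning_rate": [0.5, 1.0]}

search = GridSearchCV(AdaBoostClassifier(random_state=42), param_grid, cv=3)
search.fit(X, y)
print("Best parameters:", search.best_params_)
```

`GridSearchCV` combines both ideas from the list above: every candidate parameter setting is scored by cross-validation, so the tuned model is chosen on generalization performance rather than training accuracy.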
