Python Machine Learning By Example

Estimating with support vector regression

As the name implies, support vector regression (SVR) is part of the support vector family and a sibling of the Support Vector Machine (SVM) for classification (which we can simply call SVC).

To recap, SVC seeks an optimal hyperplane that best segregates observations from different classes. In SVR, our goal is to find a decision hyperplane (defined by a slope vector w and an intercept b), y = wx + b, so that two hyperplanes, y = wx + b − ε (the negative hyperplane) and y = wx + b + ε (the positive hyperplane), cover most of the training data. In other words, most data points are bounded within the ε band around the optimal hyperplane. At the same time, the optimal hyperplane should be as flat as possible, which means w should be as small as possible, as shown in the following diagram:

Figure 9.13: Finding the decision hyperplane in SVR

This translates into deriving the optimal w and b by solving the following optimization problem:

  • Minimizing ||w||²/2
  • Subject to |y^(i) − (wx^(i) + b)| ≤ ε, given a training set of (x^(1), y^(1)), (x^(2), y^(2)), …, (x^(m), y^(m))
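To make these two conditions concrete, here is a minimal sketch on a toy one-dimensional training set; the data points, the candidate w and b values, and ε are all made up for illustration. Both candidate hyperplanes keep every point inside the ε band, so SVR would prefer the flatter one, that is, the one with the smaller ||w||:

import numpy as np

# Toy training set (x^(i), y^(i)) -- made-up numbers for illustration
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([1.0, 3.1, 4.9, 7.2, 9.0])

def inside_epsilon_band(w, b, X, y, epsilon):
    # SVR constraint: |y^(i) - (w x^(i) + b)| <= epsilon for every training point
    residuals = np.abs(y - (X @ w + b))
    return bool(np.all(residuals <= epsilon))

epsilon = 0.5
candidates = {
    "flatter hyperplane": (np.array([2.0]), 1.0),   # w = [2.0], b = 1.0
    "steeper hyperplane": (np.array([2.2]), 0.6),   # w = [2.2], b = 0.6
}

for name, (w, b) in candidates.items():
    feasible = inside_epsilon_band(w, b, X, y, epsilon)
    print(f"{name}: feasible = {feasible}, ||w|| = {np.linalg.norm(w):.2f}")

# Both candidates satisfy the constraint; SVR picks the one with the smallest ||w||.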

The theory behind SVR is very similar to that of SVM. In the next section, let's see how to implement SVR.
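As a rough preview, here is a minimal sketch of fitting an SVR model with scikit-learn's SVR estimator; the synthetic data and the kernel, C, and epsilon values below are illustrative assumptions, not the book's code:

import numpy as np
from sklearn.svm import SVR

# Synthetic data from y = 2x + noise (for illustration only)
rng = np.random.RandomState(42)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=100)

# epsilon controls the width of the tube; C penalizes points that fall outside it
regressor = SVR(kernel='linear', C=1.0, epsilon=0.5)
regressor.fit(X, y)

X_new = np.array([[2.5], [7.0]])
print(regressor.predict(X_new))

With a linear kernel, the learned slope should come out close to 2, since only points falling outside the ε tube incur a loss.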
