Machine Learning for OpenCV

You're reading from Machine Learning for OpenCV: Intelligent image processing with Python.

Product type: Paperback
Published: Jul 2017
Publisher: Packt
ISBN-13: 9781783980284
Length: 382 pages
Edition: 1st
Authors (2): Michael Beyeler, Michael Beyeler (USD)
Table of Contents (13)

Preface
1. A Taste of Machine Learning
2. Working with Data in OpenCV and Python
3. First Steps in Supervised Learning
4. Representing Data and Engineering Features
5. Using Decision Trees to Make a Medical Diagnosis
6. Detecting Pedestrians with Support Vector Machines
7. Implementing a Spam Filter with Bayesian Learning
8. Discovering Hidden Structures with Unsupervised Learning
9. Using Deep Learning to Classify Handwritten Digits
10. Combining Different Algorithms into an Ensemble
11. Selecting the Right Model with Hyperparameter Tuning
12. Wrapping Up

Using decision trees for regression

Although we have so far focused on using decision trees for classification tasks, they can also be used for regression. However, OpenCV does not provide this flexibility, so we will turn to scikit-learn again. We therefore only briefly review its functionality here.

Let's say we wanted to use a decision tree to fit a sine wave. To make things interesting, we will also add some noise to the data points using NumPy's random number generator:

In [1]: import numpy as np
... rng = np.random.RandomState(42)

We then create 100 x values between 0 and 5, sorted in ascending order, and calculate the corresponding sine values:

In [2]: X = np.sort(5 * rng.rand(100, 1), axis=0)
... y = np.sin(X).ravel()

We then add noise to every other data point in y (using y[::2], which selects 50 of the 100 points), scaled by 0.5 so we don't introduce too much jitter:

In [3]: y[::2] += 0.5 * (0.5 - rng.rand(50))
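With the noisy sine data in place, the natural next step is to fit a regression tree. The excerpt cuts off here, so the following is a minimal sketch (not taken from the book) using scikit-learn's DecisionTreeRegressor; the max_depth values are illustrative choices to contrast a coarse fit with a finer one:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Rebuild the noisy sine data from the preceding steps
rng = np.random.RandomState(42)
X = np.sort(5 * rng.rand(100, 1), axis=0)
y = np.sin(X).ravel()
y[::2] += 0.5 * (0.5 - rng.rand(50))

# A shallow tree gives a coarse, piecewise-constant approximation;
# a deeper tree follows the data (and its noise) more closely
regr_shallow = DecisionTreeRegressor(max_depth=2).fit(X, y)
regr_deep = DecisionTreeRegressor(max_depth=5).fit(X, y)

# Predict on a dense grid to see the step-shaped regression curves
X_test = np.arange(0.0, 5.0, 0.01)[:, np.newaxis]
y_shallow = regr_shallow.predict(X_test)
y_deep = regr_deep.predict(X_test)
```

Because a depth-2 tree has at most four leaves, its prediction takes at most four distinct values across the whole range, which is what gives decision-tree regression its characteristic staircase look.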