Machine Learning for OpenCV 4

Summary

In this chapter, we covered quite a lot of ground, didn't we?

In short, we learned about a range of supervised learning algorithms, how to apply them to real datasets, and how to implement everything in OpenCV. We introduced classification algorithms such as k-NN and logistic regression and discussed how they can be used to predict labels belonging to two or more discrete categories. We also covered several variants of linear regression, such as Lasso regression and ridge regression, and discussed how they can be used to predict continuous target variables. Last but not least, we got acquainted with the Iris and Boston Housing datasets, two classics in the history of machine learning.
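
To recap the kind of workflow this involves, here is a minimal sketch of training a k-NN classifier on the Iris dataset with OpenCV's ml module. The particular choices here (k=3, a 70/30 train/test split, the fixed random seed) are illustrative assumptions, not the exact settings used in the chapter:

import numpy as np
import cv2
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the Iris dataset; OpenCV's ml module expects float32 features
iris = load_iris()
X = iris.data.astype(np.float32)
y = iris.target.astype(np.int32)

# Illustrative 70/30 split with a fixed seed for reproducibility
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Create and train the k-NN classifier
knn = cv2.ml.KNearest_create()
knn.train(X_train, cv2.ml.ROW_SAMPLE, y_train)

# Predict labels for the test set using the 3 nearest neighbors
_, y_pred, _, _ = knn.findNearest(X_test, k=3)
accuracy = np.mean(y_pred.ravel() == y_test)
print(f"k-NN accuracy on Iris: {accuracy:.2f}")

The same pattern (convert the data to float32, call train with cv2.ml.ROW_SAMPLE, then predict) carries over to the other OpenCV classifiers and regressors we met in this chapter.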

In the following chapters, we will go into much greater depth on these topics and explore some more interesting examples of where these concepts can be useful.

But first, we need to talk about another...
