Python Machine Learning By Example

You're reading from Python Machine Learning By Example: Implement machine learning algorithms and techniques to build intelligent systems

Product type: Paperback
Published: Feb 2019
Publisher: Packt
ISBN-13: 9781789616729
Length: 382 pages
Edition: 2nd Edition
Author: Yuxi (Hayden) Liu
Table of Contents (15 chapters)

Preface
1. Section 1: Fundamentals of Machine Learning
2. Getting Started with Machine Learning and Python
3. Section 2: Practical Python Machine Learning By Example
4. Exploring the 20 Newsgroups Dataset with Text Analysis Techniques
5. Mining the 20 Newsgroups Dataset with Clustering and Topic Modeling Algorithms
6. Detecting Spam Email with Naive Bayes
7. Classifying Newsgroup Topics with Support Vector Machines
8. Predicting Online Ad Click-Through with Tree-Based Algorithms
9. Predicting Online Ad Click-Through with Logistic Regression
10. Scaling Up Prediction to Terabyte Click Logs
11. Stock Price Prediction with Regression Algorithms
12. Section 3: Python Machine Learning Best Practices
13. Machine Learning Best Practices
14. Other Books You May Enjoy

Handling multiclass classification

One last thing worth noting is how logistic regression algorithms handle multiclass classification. Although we interact with scikit-learn classifiers in multiclass cases in the same way as in binary cases (see the sketch below), it is worth understanding how logistic regression works in multiclass classification.
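
Here is a minimal sketch of that identical interface; the three-class toy arrays X and y below are made up purely for illustration, not taken from the book:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# A made-up three-class toy dataset: six samples, two features
X = np.array([[0.2, 1.1], [0.4, 0.9], [3.1, 0.2],
              [2.8, 0.5], [1.5, 3.3], [1.7, 3.0]])
y = np.array([0, 0, 1, 1, 2, 2])

# Exactly the same estimator and calls as in the binary case; with the
# default lbfgs solver, scikit-learn fits the multinomial (softmax)
# model on multiclass targets
clf = LogisticRegression()
clf.fit(X, y)

print(clf.predict(X[:2]))        # predicted class labels
print(clf.predict_proba(X[:2]))  # one probability per class; each row sums to 1
```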

Logistic regression for more than two classes is also called multinomial logistic regression, better known these days as softmax regression. As we have seen in the binary case, the model is represented by one weight vector, w, and the probability of the target being 1 (the positive class) is written as follows:

$$P(y = 1 \mid x) = \frac{1}{1 + \exp(-w^{T}x)}$$
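
To make the formula concrete, here is a tiny NumPy sketch that evaluates this probability for one sample; the weight vector w and the input x are made-up values, not the output of any training run:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function, mapping any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.8, -0.4])    # illustrative weight vector
x = np.array([1.5, 2.0])     # one input sample

p_positive = sigmoid(w @ x)  # P(y = 1 | x)
print(p_positive)            # ~0.599 for these numbers
```
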
In the K-class case, the model is represented by K weight vectors, w1, w2, …, wK, and the probability of the target being class k is written as follows:

$$P(y = k \mid x) = \frac{\exp(w_{k}^{T}x)}{\sum_{j=1}^{K} \exp(w_{j}^{T}x)}$$
Note that the term $\sum_{j=1}^{K} \exp(w_{j}^{T}x)$ normalizes the probabilities (for k from 1 to K) so that they total...
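
As a quick sanity check on that normalization, here is a minimal NumPy sketch of the softmax computation; the weight matrix W (one weight vector per row) and the input x are made-up values, and the max-subtraction line is a standard numerical-stability trick that does not change the result:

```python
import numpy as np

def softmax_probs(W, x):
    """P(y = k | x) for each class k, with one weight vector per row of W."""
    scores = W @ x                        # w_k^T x for every class k
    scores -= scores.max()                # stabilize exp() without changing the output
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()  # the denominator normalizes the probabilities

W = np.array([[0.5, -0.2],    # w_1 (illustrative values)
              [0.1,  0.3],    # w_2
              [-0.4, 0.6]])   # w_3
x = np.array([1.0, 2.0])

probs = softmax_probs(W, x)
print(probs, probs.sum())     # the K probabilities total 1.0
```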
