Bayesian Analysis with Python
A practical guide to probabilistic modeling
Product type: Paperback
Published: Jan 2024
Publisher: Packt
ISBN-13: 9781805127161
Length: 394 pages
Edition: 3rd Edition

Author: Osvaldo Martin
Table of Contents (15 chapters)

Preface
Chapter 1: Thinking Probabilistically
Chapter 2: Programming Probabilistically
Chapter 3: Hierarchical Models
Chapter 4: Modeling with Lines
Chapter 5: Comparing Models
Chapter 6: Modeling with Bambi
Chapter 7: Mixture Models
Chapter 8: Gaussian Processes
Chapter 9: Bayesian Additive Regression Trees
Chapter 10: Inference Engines
Chapter 11: Where to Go Next
Bibliography
Other Books You May Enjoy
Index

7.2 Finite mixture models

One way to build mixture models is to consider a finite weighted mixture of two or more distributions. The probability density of the observed data is then a weighted sum of the probability densities of the K subgroups:

$$p(y) = \sum_{i=1}^{K} w_i \, p(y \mid \theta_i)$$

We can interpret $w_i$ as the probability of component $i$; thus, its values are restricted to the interval [0, 1], and the weights must sum to 1. The components $p(y \mid \theta_i)$ are usually simple distributions, such as a Gaussian or a Poisson. If K is finite, we have a finite mixture model. To fit such a model, we need to provide a value for K, either because we know the correct value beforehand or because we can make an educated guess.
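To make the weighted-sum definition concrete, here is a minimal sketch (not from the book) that evaluates the density of a three-component Gaussian mixture with NumPy and SciPy; the weights, means, and standard deviations are made-up illustrative values:

```python
# Illustrative sketch: p(y) = sum_i w_i * p(y | theta_i) for a
# three-component Gaussian mixture. All parameter values are made up.
import numpy as np
from scipy import stats

weights = np.array([0.3, 0.5, 0.2])   # w_i: each in [0, 1], summing to 1
means = np.array([-2.0, 0.0, 3.0])    # component means
sds = np.array([0.5, 1.0, 0.7])       # component standard deviations

y = np.linspace(-5.0, 6.0, 500)
# Weighted sum of the K component densities
mixture_pdf = sum(w * stats.norm(mu, sd).pdf(y)
                  for w, mu, sd in zip(weights, means, sds))
```

Because the weights sum to 1 and each component density integrates to 1, the resulting mixture density also integrates to 1.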

Conceptually, to solve a mixture model, all we need to do is properly assign each data point to one of the components. In a probabilistic model, we can do this by introducing a random variable, whose function is to specify to which component a particular observation is assigned. This variable is generally referred...
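As a hedged sketch of this idea, the model below uses PyMC with an explicit latent assignment variable z; the synthetic data, prior choices, and K = 2 are assumptions for illustration, not the book's own example:

```python
# Sketch of a finite Gaussian mixture with an explicit latent assignment
# variable z. Synthetic data and prior choices are illustrative assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(123)
data = np.concatenate([rng.normal(-2.0, 1.0, 150),
                       rng.normal(3.0, 1.0, 50)])
K = 2  # number of components, fixed beforehand

with pm.Model() as mixture_model:
    w = pm.Dirichlet("w", a=np.ones(K))                  # component weights
    means = pm.Normal("means", mu=0, sigma=10, shape=K)  # component means
    sd = pm.HalfNormal("sd", sigma=10)                   # shared std. dev.
    # Latent variable: which component each observation is assigned to
    z = pm.Categorical("z", p=w, shape=len(data))
    y = pm.Normal("y", mu=means[z], sigma=sd, observed=data)
    idata = pm.sample()
```

In practice, the discrete variable z is often marginalized out (for instance with pm.NormalMixture), which usually samples more efficiently than combining Gibbs steps for z with NUTS for the continuous parameters.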
