IPython Interactive Computing and Visualization Cookbook

Harness IPython for powerful scientific computing and Python data visualization with this collection of more than 100 practical data science recipes.

By Cyrille Rossant. Paperback, published by Packt in September 2014. ISBN-13: 9781783284818. 512 pages, 1st edition.
Table of Contents
Preface
1. A Tour of Interactive Computing with IPython
2. Best Practices in Interactive Computing
3. Mastering the Notebook
4. Profiling and Optimization
5. High-performance Computing
6. Advanced Visualization
7. Statistical Data Analysis
8. Machine Learning
9. Numerical Optimization
10. Signal Processing
11. Image and Audio Processing
12. Deterministic Dynamical Systems
13. Stochastic Dynamical Systems
14. Graphs, Geometry, and Geographic Information Systems
15. Symbolic and Numerical Mathematics
Index

Fitting a Bayesian model by sampling from a posterior distribution with a Markov chain Monte Carlo method


In this recipe, we illustrate a very common and useful method for characterizing a posterior distribution in a Bayesian model. Imagine that you have some data and you want to obtain information about the underlying random phenomenon. In a frequentist approach, you could try to fit a probability distribution within a given family of distributions, using a parametric method such as the maximum likelihood method. The optimization procedure would yield the parameter values that maximize the likelihood of the observed data under the chosen model.
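To make the frequentist approach concrete, here is a minimal sketch of a maximum likelihood fit with SciPy. The synthetic dataset and the choice of a normal family are assumptions made up for this illustration, not part of the recipe:

    import numpy as np
    import scipy.stats as st

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=0.5, size=1000)  # synthetic sample

    # norm.fit() returns the maximum likelihood estimates of the
    # mean and standard deviation of a normal distribution.
    mu_hat, sigma_hat = st.norm.fit(data)
    print(mu_hat, sigma_hat)  # close to the true values 2.0 and 0.5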

In a Bayesian approach, you consider the parameters themselves as random variables. Their prior distributions reflect your initial knowledge about these parameters. After the observations, your knowledge is updated, and this is reflected in the posterior distributions of the parameters.
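For instance, consider estimating the unknown bias q of a coin. With a Beta prior, which is conjugate to the binomial likelihood, the posterior after observing h heads in n flips is again a Beta distribution, so the prior-to-posterior update has a closed form. The sketch below uses made-up numbers and is only an illustration of this update, not the recipe's model:

    import scipy.stats as st

    a, b = 1, 1        # Beta(1, 1) prior: uniform, i.e. no initial preference
    n, h = 100, 61     # hypothetical observations: 61 heads in 100 flips

    # Conjugacy: the posterior is Beta(a + h, b + n - h).
    posterior = st.beta(a + h, b + n - h)
    print(posterior.mean())           # posterior mean of q, about 0.61
    print(posterior.interval(0.95))   # 95% credible interval for q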

A typical goal for Bayesian inference is to characterize the posterior distributions of the model parameters. When the posterior cannot be derived analytically, a Markov chain Monte Carlo (MCMC) method can be used to draw samples from it instead.
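Most posteriors do not admit a closed form like the conjugate example above, which is where MCMC comes in. Below is a minimal, hand-rolled Metropolis sampler for the coin-bias posterior from the previous sketch. It is meant only to show the idea of sampling from a posterior; a real analysis would use a dedicated library such as PyMC rather than a homemade sampler:

    import numpy as np

    n, h = 100, 61     # same hypothetical coin-flip data as above

    def log_posterior(q):
        # Uniform prior on [0, 1] plus binomial log-likelihood,
        # up to an additive constant.
        if not 0.0 < q < 1.0:
            return -np.inf
        return h * np.log(q) + (n - h) * np.log(1.0 - q)

    rng = np.random.default_rng(0)
    q, lp = 0.5, log_posterior(0.5)         # starting point of the chain
    samples = []
    for _ in range(10_000):
        q_new = q + rng.normal(scale=0.1)   # symmetric random-walk proposal
        lp_new = log_posterior(q_new)
        # Metropolis rule: accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_new - lp:
            q, lp = q_new, lp_new
        samples.append(q)

    trace = np.array(samples[1000:])        # discard burn-in
    print(trace.mean())                     # approximates the posterior mean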
