Nonparametric Bayesian Methods: Models, Algorithms, and Applications
This tutorial was presented as a primer in the Models, Inference, and Algorithms (MIA) series at the Broad Institute of MIT and Harvard.
See this link for the latest versions and videos of this tutorial.
Wednesday, May 17, 2017
8:30–9:30 AM
Instructor:
Professor Tamara Broderick
Email:
Description
Nonparametric Bayesian methods make use of infinite-dimensional mathematical structures to allow practitioners to learn more from their data as the size of the data set grows. What does that mean, and how does it work in practice? In this tutorial, we'll cover why machine learning and statistics need more than just parametric Bayesian inference. We'll introduce and study a foundational nonparametric Bayesian model known as the Dirichlet process. Along the way, we'll see what exactly nonparametric Bayesian methods are and what they accomplish.
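One common constructive description of the Dirichlet process is stick-breaking. The snippet below is a minimal sketch of that idea, not code from the tutorial materials: it draws truncated GEM (stick-breaking) weights and pairs them with atoms from a standard-normal base measure. The function name `gem_weights`, the truncation level, and the choice of base measure are illustrative assumptions.

```python
import numpy as np

def gem_weights(alpha, num_sticks, rng):
    """Truncated GEM (stick-breaking) weights:
    beta_k ~ Beta(1, alpha), w_k = beta_k * prod_{j<k} (1 - beta_j)."""
    betas = rng.beta(1.0, alpha, size=num_sticks)
    leftover = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * leftover

rng = np.random.default_rng(0)
weights = gem_weights(alpha=1.0, num_sticks=20, rng=rng)  # mixing weights of a truncated DP draw
atoms = rng.normal(size=20)                               # atom locations from a N(0, 1) base measure
print(weights.sum())                                      # close to 1 for a generous truncation level
```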
Materials
- [Slides]
- README for demos
- Demo 1 [code]: Dirichlet random variable and random distribution intuition
- Demo 2 [code]: K large relative to N intuition; empty components (a minimal sketch follows this list)
- Demo 3 [code]: K large relative to N intuition; growth of number of clusters
- Demo 4 [code]: GEM random distribution intuition
- Demo 5 [code]: An exact DPMM simulator
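To accompany the "K large relative to N" demos above, here is a minimal sketch of the intuition, assuming a symmetric Dirichlet prior with total mass `alpha` spread over `K` components; it is not the linked demo code, and the constants `K`, `N`, and `alpha` are illustrative. With far more components than data points, most components stay empty.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, alpha = 1000, 50, 1.0                     # many components, few data points, total mass alpha
probs = rng.dirichlet(np.full(K, alpha / K))    # symmetric Dirichlet(alpha/K, ..., alpha/K) weights
assignments = rng.choice(K, size=N, p=probs)    # mixture-component assignment for each data point
print("occupied components:", np.unique(assignments).size)  # typically far fewer than K
```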