STATS305C: Applied Statistics III
Instructor: Scott Linderman
TAs: Matt MacKay, James Yang
Term: Spring 2022
Stanford University
Course Description:
Probabilistic modeling and inference of multivariate data. Topics may include multivariate Gaussian models, probabilistic graphical models, MCMC and variational Bayesian inference, dimensionality reduction, principal components analysis, factor analysis, matrix completion, topic modeling, and state space models. The course involves extensive work with data and programming, primarily in Python.
Prerequisites:
Students should be comfortable with probability and statistics as well as multivariate calculus and linear algebra. This course will emphasize implementing models and algorithms, so coding proficiency is required.
Logistics:
- Time: Monday and Wednesday, 11:30am-1pm
- Level: advanced undergrad and up
- Grading basis: credit or letter grade
- Office hours:
- Monday 1-2pm (Scott)
- Tuesday 5:30-7pm in Bowker (Room 207, Sequoia Hall) and over Zoom (Matt)
- Friday 1-2:30pm Zoom (James)
- Final evaluation: Exam
Books
- Bishop. Pattern Recognition and Machine Learning. Springer, 2006. link
- Murphy. Probabilistic Machine Learning: Advanced Topics. MIT Press, 2023. link
- Gelman et al. Bayesian Data Analysis. Chapman and Hall/CRC, 2013. link
Assignments
- Assignment 1: Bayesian Linear Regression. Due Wed, Apr 6 at 11:59pm on Gradescope.
- Assignment 2: Gibbs Sampling and Metropolis-Hastings. Due Wed, Apr 13 at 11:59pm on Gradescope.
- Assignment 3: Continuous Latent Variable Models. Due Wed, Apr 20 at 11:59pm on Gradescope.
- Assignment 4: Bayesian Mixture Models. Due Wed, Apr 27 at 11:59pm on Gradescope.
- Assignment 5: Poisson Matrix Factorization. Due Wed, May 4 at 11:59pm on Gradescope.
- Assignment 6: Neural Networks and VAEs. Due Wed, May 11 at 11:59pm on Gradescope.
- Assignment 7: Hidden Markov Models. Due Wed, May 18 at 11:59pm on Gradescope.
Schedule
Week 1 (3/28 & 3/30): Multivariate Normal Models and Conjugate Priors
- Required Reading: Bishop, Ch 2.3
- Optional Reading: Murphy, Ch 2.3 and 3.2.4
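As a concrete warm-up for this week's material, here is a minimal NumPy sketch (illustrative only, not course-issued code) of the conjugate posterior update for a multivariate normal mean with known covariance; the prior and data values are made up:

```python
# Conjugate update for the mean of a multivariate normal, known covariance.
import numpy as np

rng = np.random.default_rng(0)
d, n = 2, 50
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])    # known observation covariance
mu_true = np.array([1.0, -1.0])
X = rng.multivariate_normal(mu_true, Sigma, size=n)

mu0 = np.zeros(d)                              # prior mean
Sigma0 = 10.0 * np.eye(d)                      # prior covariance

# Posterior N(mu_n, Sigma_n): precision-weighted average of prior and data.
Sigma0_inv = np.linalg.inv(Sigma0)
Sigma_inv = np.linalg.inv(Sigma)
Sigma_n = np.linalg.inv(Sigma0_inv + n * Sigma_inv)
mu_n = Sigma_n @ (Sigma0_inv @ mu0 + Sigma_inv @ X.sum(axis=0))
print(mu_n)    # shrinks the sample mean toward mu0
```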
Week 2 (4/4 & 4/6): Hierarchical Models and Gibbs Sampling
- Required Reading: Bishop, Ch 8.1-8.2 and 11.2-11.3
- Optional Reading: Murphy, Ch 3.5.2, 4.2, and 11.1-11.3
- Optional Reading: Gelman, Ch 5
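A toy Gibbs sampler for a correlated bivariate normal, in the spirit of this week's readings; the correlation and chain length are arbitrary illustration values:

```python
# Gibbs sampling: alternate draws from the two full conditionals of a
# bivariate normal with zero means, unit variances, and correlation rho.
import numpy as np

rng = np.random.default_rng(0)
rho, T = 0.9, 5000
x = np.zeros(2)
samples = np.empty((T, 2))
for t in range(T):
    # x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2 | x1.
    x[0] = rng.normal(rho * x[1], np.sqrt(1 - rho**2))
    x[1] = rng.normal(rho * x[0], np.sqrt(1 - rho**2))
    samples[t] = x
print(np.corrcoef(samples[1000:].T))  # empirical correlation ≈ rho
```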
Week 3 (4/11 & 4/13): Continuous Latent Variable Models and HMC
- Required Reading: Bishop, Ch 12.1-12.2
- Required Reading: "MCMC using Hamiltonian dynamics," Neal, 2012
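A bare-bones HMC transition following the leapfrog scheme described in Neal's paper; the standard-normal target, step size, and leapfrog count are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    return -x  # gradient of the standard normal log density

def hmc_step(x, step_size=0.1, n_leapfrog=20):
    p = rng.normal(size=x.shape)                  # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * step_size * grad_log_p(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_p(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_p(x_new)
    # Metropolis correction using the Hamiltonian H = U(x) + K(p).
    H_old = 0.5 * x @ x + 0.5 * p @ p
    H_new = 0.5 * x_new @ x_new + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < H_old - H_new else x

x = np.zeros(2)
samples = []
for _ in range(2000):
    x = hmc_step(x)
    samples.append(x)
samples = np.array(samples)
print(samples.mean(axis=0), samples.var(axis=0))  # ≈ [0, 0] and ≈ [1, 1]
```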
Week 4 (4/18 & 4/20): Mixture Models and EM
- Required Reading: Bishop, Ch 9
- Optional Reading: Murphy, Ch 6.7
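An illustrative EM loop for a one-dimensional, two-component Gaussian mixture, a toy rendering of the algorithm in Bishop Ch 9; the data-generating parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])

pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # component means
sigma = np.array([1.0, 1.0])   # component std devs

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: responsibilities r[n, k] ∝ pi_k N(x_n | mu_k, sigma_k^2).
    r = pi * normal_pdf(X[:, None], mu, sigma)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted maximum-likelihood updates.
    Nk = r.sum(axis=0)
    pi = Nk / len(X)
    mu = (r * X[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((r * (X[:, None] - mu) ** 2).sum(axis=0) / Nk)

print(pi, mu, sigma)   # should recover ≈ (0.4, 0.6), (-2, 3), (1, 1)
```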
Week 5 (4/25 & 4/27): Mixed Membership Models and Mean Field VI
- Required Reading: "Probabilistic topic models," Blei, 2012
- Required Reading: "Variational Inference: A Review for Statisticians," Blei et al., 2017
- Optional Reading: Murphy, Ch 10.2
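A tiny coordinate-ascent (CAVI) sketch of mean-field VI for a correlated bivariate Gaussian, echoing the classic factorized example; the target mean and correlation are made-up values:

```python
# Mean-field VI: q(x) = q1(x1) q2(x2) fit to a correlated N(mu, Sigma).
import numpy as np

rho = 0.9
mu = np.array([1.0, -1.0])
Lam = np.linalg.inv(np.array([[1.0, rho], [rho, 1.0]]))  # precision matrix

m = np.zeros(2)                  # variational means, initialized at zero
for _ in range(50):
    # Coordinate updates: each factor conditions on the other's mean.
    m[0] = mu[0] - (Lam[0, 1] / Lam[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (Lam[1, 0] / Lam[1, 1]) * (m[0] - mu[0])
v = 1.0 / np.diag(Lam)           # factor variances
print(m, v)  # means match the target; variances understate the true marginals
```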
Week 6 (5/2 & 5/4): Variational Autoencoders and Fixed-Form VI
- Required Reading: "An Introduction to Variational Autoencoders" (Ch 1 and 2), Kingma and Welling, 2019
- Optional Reading: Murphy, Ch 10.3
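A fixed-form VI sketch using the reparameterization trick from the Kingma and Welling reading, applied to a one-dimensional Gaussian target; the target parameters, learning rate, and Monte Carlo sample size are arbitrary choices:

```python
# Fit q(z) = N(m, s^2) to log p(z) = -0.5 (z - 2)^2 / 0.5^2 by stochastic
# gradient ascent on the ELBO, with gradients via z = m + s * eps.
import numpy as np

rng = np.random.default_rng(0)

def dlogp_dz(z):
    return -(z - 2.0) / 0.5**2   # gradient of the target log density

m, log_s = 0.0, 0.0              # variational parameters
lr, n_mc = 0.05, 32
for _ in range(2000):
    eps = rng.normal(size=n_mc)
    z = m + np.exp(log_s) * eps                          # reparameterized draws
    g = dlogp_dz(z)
    grad_m = g.mean()                                    # dELBO/dm
    grad_log_s = (g * eps).mean() * np.exp(log_s) + 1.0  # + entropy term
    m += lr * grad_m
    log_s += lr * grad_log_s
print(m, np.exp(log_s))          # ≈ 2.0 and ≈ 0.5
```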
Week 7 (5/9 & 5/11): State Space Models and Message Passing
- Required Reading: Bishop, Ch 13
- Optional Reading: Murphy, Ch 8
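A short NumPy sketch of the forward algorithm (the message-passing recursion for HMMs); the transition, emission, and observation values are random stand-ins:

```python
# Forward algorithm for a discrete HMM: compute log p(x_{1:T}) with
# per-step normalization for numerical stability.
import numpy as np

rng = np.random.default_rng(0)
K, T = 3, 100
pi0 = np.full(K, 1.0 / K)                      # initial state distribution
A = rng.dirichlet(np.ones(K), size=K)          # transition matrix, rows sum to 1
B = rng.dirichlet(np.ones(5), size=K)          # emission probs over 5 symbols
x = rng.integers(0, 5, size=T)                 # an observation sequence to score

alpha = pi0 * B[:, x[0]]
log_Z = np.log(alpha.sum()); alpha /= alpha.sum()
for t in range(1, T):
    alpha = (alpha @ A) * B[:, x[t]]           # predict, then condition
    log_Z += np.log(alpha.sum()); alpha /= alpha.sum()
print(log_Z)                                   # log-likelihood of the sequence
```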
Week 8 (5/16 & 5/18): Bayesian Nonparametrics and more MCMC
- Required Reading: Bishop, Ch 6.4
- Optional Reading: Kingman, Poisson Processes, 1993, Ch 1-2
- Optional Reading: Adams et al., 2019
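For the Gaussian-process material in Bishop Ch 6.4, a compact GP regression sketch; the RBF kernel, lengthscale, and noise level are assumed choices:

```python
# GP regression: posterior mean and variance at test points.
import numpy as np

rng = np.random.default_rng(0)

def rbf(X1, X2, ell=1.0):
    return np.exp(-0.5 * (X1[:, None] - X2[None, :]) ** 2 / ell**2)

X = rng.uniform(-3, 3, 20)                   # training inputs
y = np.sin(X) + 0.1 * rng.normal(size=20)    # noisy targets
Xs = np.linspace(-3, 3, 100)                 # test inputs
sigma2 = 0.1**2                              # observation noise variance

K = rbf(X, X) + sigma2 * np.eye(len(X))
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)            # posterior mean
var = rbf(Xs, Xs).diagonal() - np.einsum(
    "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))  # posterior variance
print(mean[:5], var[:5])
```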
Weeks 9 and 10: Research Topics in Probabilistic Machine Learning
- TBD