Course Description

Probabilistic graphical modeling and inference is a powerful modern approach to representing the joint statistics of data and models, reasoning about the world in the face of uncertainty, and learning from data. It cleanly separates the notions of representation, reasoning, and learning. It provides a principled framework for combining multiple sources of information, such as prior knowledge about the world and evidence observed in a particular case. This course will provide a solid introduction to the methodology and associated techniques, and show how they are applied in diverse domains ranging from computer vision to computational biology to computational neuroscience.
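As a small illustration of the "combining prior knowledge with observed evidence" idea mentioned above, here is a minimal sketch of Bayesian updating via Bayes' rule. The scenario and all numbers are hypothetical, chosen only for illustration; they are not course material.

```python
# Minimal sketch of Bayesian updating: combining a prior with evidence.
# Hypothetical numbers: a diagnostic test for a rare condition.
prior = 0.01             # P(condition): prior knowledge about the world
sensitivity = 0.95       # P(positive | condition)
false_positive = 0.05    # P(positive | no condition)

# Bayes' rule: posterior = likelihood * prior / evidence
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence
print(f"P(condition | positive test) = {posterior:.3f}")  # about 0.161
```

Even with a positive test, the posterior probability stays modest because the prior is small; this interplay between prior and likelihood is a recurring theme in the course.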


All reading material will be made available through presentation slides or the course webpage. Students will find the following optional textbooks useful throughout this course:

Bishop, C. "Pattern Recognition and Machine Learning." Springer, 2006 (PDF)
Murphy, K. "Machine Learning: A Probabilistic Perspective." MIT Press, 2012 (UA Library)

Note that Bishop's book is freely available to anyone. Murphy's book is available online through the UA library with NetID login.

Instructor and Contact Information:

Instructor: Jason Pacheco, GS 724, Email:
Office Hours: Tuesday @ 3-4:30pm (Tucson time)
Optional Office Hours: Thursday @ 9-10:30am (Message professor on Piazza at least the day before)
Instructor Homepage:

Course Schedule:

8/24  Introduction + Course Overview (slides)
      Assignment: Matlab Primer
8/26  Probability Primer (Part I) (slides: 1-24)
      Readings: Murphy, Secs. 2.1 and 2.2
8/31  Probability Primer (Part II) (slides: 25-42)
      Readings: Murphy, Secs. 2.3-2.5; Bishop, Secs. 2.1-2.3.2
      Assignment: HW1 (Due: 9/9)
9/2   Probability Primer (Part III) (slides: 43-end); Bayesian Probability and Statistics (Part I)
      Readings: Murphy, Secs. 3.1-3.2; Bishop, Sec. 1.2
9/7   Labor Day - No Class
9/9   Bayesian Probability and Statistics (Part II)
      Readings: Murphy, Secs. 3.3-3.5
9/14  Probabilistic Graphical Models (Part I: Directed) (slides: 1-49)
      Readings: Murphy, Secs. 10.1-10.2; Bishop, Secs. 8.1-8.2
9/16  Probabilistic Graphical Models (Part II: Undirected) (slides: 50-end)
      Readings: Murphy, Secs. 19.1-19.4; Bishop, Sec. 8.3
      Assignment: HW2 (Due: 9/30)
9/21  Message Passing Inference (Part I: Variable Elimination) (slides: 1-44)
      Readings: Murphy, Sec. 20.3; Bishop, Sec. 8.4.1
9/23  Message Passing Inference (Part II: Belief Propagation)
      Readings: Murphy, Secs. 20.1-20.2; Bishop, Secs. 8.4.4-8.4.5
9/28  Message Passing Inference (Part III: Junction Tree, Loopy Belief Propagation)
9/30  Parameter Learning (Part I: MLE, MAP)
10/5  Parameter Learning (Part II: Expectation Maximization)
10/7  Parameter Learning (Part III: EM Cont'd)
10/19 Midterm Exam
10/21 Monte Carlo Methods (Part I: Rejection Sampling, Importance Sampling)
10/26 Monte Carlo Methods (Part II: Markov Chain Monte Carlo - Metropolis-Hastings)
10/28 Monte Carlo Methods (Part III: MCMC - Gibbs Sampling)
11/2  Dynamical Systems (Part I: Linear Dynamical Systems)
11/4  Dynamical Systems (Part II: Kalman Filter)
11/9  Dynamical Systems (Part III: Nonlinear & Switching Dynamical Systems / Particle Filters)
11/11 Veterans Day - No Class
11/16 Variational Inference (Part I: Mean Field)
11/18 Variational Inference (Part II: Stochastic Variational Inference)
11/23 Variational Inference (Part III: Bethe Free Energy Methods)
11/25 Variational Inference (Part IV: Particle Belief Propagation)
11/30 Topic Models, Latent Dirichlet Allocation
12/2  Bayesian Nonparametrics (Part I: Dirichlet Process)
12/7  Bayesian Nonparametrics (Part II: Hierarchical DP)
12/9  Last Class: Wrapup
12/14 Final Exam

© Jason Pacheco, 2020