Probabilistic graphical modeling and inference is a powerful modern approach to representing the combined statistics of data and models, reasoning about the world in the face of uncertainty, and learning about it from data. It cleanly separates the notions of representation, reasoning, and learning. It provides a principled framework for combining multiple sources of information, such as prior knowledge about the world, with evidence about a particular case in observed data. This course will provide a solid introduction to the methodology and associated techniques, and show how they are applied in diverse domains ranging from computer vision to computational biology to computational neuroscience.
All reading material will be made available through presentation slides or the course webpage. Students will find the following optional textbooks useful throughout this course:
Instructor: Jason Pacheco, GS 724, Email: pachecoj@cs.arizona.edu
Office Hours: Tuesday @ 3–4:30pm (Tucson time)
Optional Office Hours: Thursday @ 9–10:30am (Message the professor on Piazza at least a day in advance)
D2L: https://d2l.arizona.edu/d2l/home/937505
Piazza: https://piazza.com/arizona/fall2020/csc535
Instructor Homepage: http://www.pachecoj.com
Date  Topic  Readings  Assignment 

8/24  Introduction + Course Overview (slides)  Matlab Primer  
8/26  Probability Primer (Part I) (slides: 1–24)  Murphy: Secs. 2.1 and 2.2  
8/31  Probability Primer (Part II) (slides: 25–42)  Murphy: Secs. 2.3–2.5; Bishop: Secs. 2.1–2.3.2  HW1 (Due: 9/9) 
9/2  Probability Primer (Part III) (slides: 43–end); Bayesian Probability and Statistics (Part I) (slides)  Murphy: Secs. 3.1–3.2; Bishop: Sec. 1.2  
9/7  Labor Day  No Class  
9/9  Bayesian Probability and Statistics (Part II) (slides)  Murphy: Secs. 3.3–3.5  
9/14  Probabilistic Graphical Models (Part I: Directed) (slides: 1–49)  Murphy: Secs. 10.1–10.2; Bishop: Secs. 8.1–8.2  
9/16  Probabilistic Graphical Models (Part II: Undirected) (slides: 50–end)  Murphy: Secs. 19.1–19.4; Bishop: Sec. 8.3  HW2 (Due: 9/30) 
9/21  Message Passing Inference (Part I: Variable Elimination) (slides: 1–44)  Murphy: Sec. 20.3; Bishop: Sec. 8.4.1  
9/23  Message Passing Inference (Part II: Sum-Product Belief Propagation (BP)) (slides: 46–89)  Murphy: Secs. 20.1–20.2; Bishop: Secs. 8.4.1–8.4.4  
9/28  Message Passing Inference (Part II: Sum-Product BP Cont'd) (slides: 46–89)  
9/30  Message Passing Inference (Part III: Max-Product / Max-Sum BP) (slides: 90–109)  Bishop: Sec. 8.4.5  
10/5  Message Passing Inference (Part IV: Junction Tree Algorithm) (slides: 109–131)  Murphy: Sec. 20.4  HW3 (Due: 10/19); Handout Code: Factor Graph, LDPC 
10/7  Message Passing Inference (Part V: Loopy Belief Propagation) (slides: 131–end)  Murphy: Sec. 22.2  
10/12  Parameter Learning (Part I: MLE, MAP) (slides: 1–28)  
10/14  Parameter Learning (Part II: Expectation Maximization) (slides: 30)  
10/19  Parameter Learning (Part II: EM Cont'd) (slides: 30)  
10/21  Monte Carlo Methods (Part I: Rejection Sampling, Importance Sampling)  
10/26  Monte Carlo Methods (Part II: Markov chain Monte Carlo – Metropolis-Hastings)  Midterm Exam  
10/28  Monte Carlo Methods (Part III: MCMC – Gibbs Sampling)  
11/2  Dynamical Systems (Part I: Linear Dynamical Systems)  
11/4  Dynamical Systems (Part II: Kalman Filter)  
11/9  Dynamical Systems (Part III: Nonlinear & Switching Dynamical Systems / Particle Filters)  
11/11  Veterans Day  No Class  
11/16  Variational Inference (Part I: Mean Field)  
11/18  Variational Inference (Part II: Stochastic Variational Inference)  
11/23  Variational Inference (Part III: Bethe Free Energy Methods)  
11/25  Variational Inference (Part IV: Particle Belief Propagation)  
11/30  Topic Models, Latent Dirichlet Allocation  
12/2  Bayesian Nonparametrics (Part I: Dirichlet Process)  
12/7  Bayesian Nonparametrics (Part II: Hierarchical DP)  
12/9  Last Class: Wrapup  
12/14  Final Exam 