Course Description

The aim of this course is to explore advanced techniques in probabilistic graphical models (PGMs) and, more broadly, statistical machine learning (ML). Students will learn to perform statistical inference and reasoning in complex probabilistic models and to apply these techniques to their own research. The course surveys state-of-the-art ML research, including variational inference, Bayesian deep learning, representation learning, and uncertainty quantification. Upon conclusion of this course, students will be capable of developing new methods and advancing the state of the art in ML and PGM research.
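As a flavor of the inference techniques the course covers, the snippet below is a minimal sketch (not course material) of plain Monte Carlo estimation: approximating an expectation E[f(X)] by averaging f over i.i.d. samples of X. The function and example values are illustrative choices, not from the syllabus.

```python
import random

def mc_expectation(f, sampler, n=100_000):
    """Plain Monte Carlo estimate of E[f(X)] from n i.i.d. draws of X."""
    return sum(f(sampler()) for _ in range(n)) / n

random.seed(0)
# Estimate E[X^2] for X ~ N(0, 1); the true value is Var(X) = 1.
est = mc_expectation(lambda x: x * x, lambda: random.gauss(0.0, 1.0))
```

The estimator's error shrinks at rate O(1/sqrt(n)) regardless of dimension, which is why Monte Carlo methods are a workhorse of Bayesian inference.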

Course Management

D2L: https://d2l.arizona.edu/d2l/home/1205997
Piazza: https://piazza.com/arizona/fall2022/csc696h1

Instructor and Contact Information:

Instructor: Jason Pacheco, GS 724, Email: pachecoj@cs.arizona.edu
Office Hours (Zoom): Tuesdays 3:00-4:30pm, Thursdays 9:00-10:30am
Instructor Homepage: http://www.pachecoj.com

Date Topic Readings Presenter / Slides

1/10 Introduction + Course Overview
(slides)

1/15 Martin Luther King Jr. Day : No Classes

1/17 Probability and Statistics : Probability Theory
Reading: PRML, Sec. 1.2.1-1.2.4
(slides)

1/22 Probability and Statistics : Bayesian Statistics
Readings: Why Isn't Everyone a Bayesian? Efron, B. 1986
Objections to Bayesian Statistics. Gelman, A. 2008
(slides)

1/24 Probability and Statistics : Bayesian Statistics (Cont'd)

1/29 Inference : Monte Carlo Methods
Reading: Introduction to Monte Carlo Methods. MacKay, D. J. C. Learning in Graphical Models. Springer, 1998
(slides)

1/31 Inference : Monte Carlo Methods (Cont'd)

2/5 Inference : Variational Inference
Reading: Variational Inference: A Review for Statisticians. Blei, D. et al. J. Am. Stat. Assoc., 2017
Optional: PRML, Sec. 10.1-10.4
(slides)

2/7 Inference : Approximate Bayesian Computation
Reading: Approximate Bayesian Computation (ABC). Sunnåker, M. et al. PLoS Computational Biology, 2013
Presenter: James
(slides)

2/12 Inference : Bayesian Conditional Density Estimation
Reading: Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation. Papamakarios, G. and Murray, I. NeurIPS, 2016
Presenter: Varun

2/14 Bayesian Deep Learning : Introduction
Reading: Weight Uncertainty in Neural Networks. Blundell, C. et al. ICML, 2015
Presenter: Cameron
(slides)

2/19 Bayesian Deep Learning : Monte Carlo Dropout
Reading: Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. Gal, Y. and Ghahramani, Z. ICML, 2016
Presenter: Miki
(slides)

2/21 Bayesian Deep Learning : Variational Dropout
Reading: Variational Dropout and the Local Reparameterization Trick. Kingma, D. P. et al. NeurIPS, 2015
Presenter: Brenda

2/26 Bayesian Deep Learning : Information Bottleneck
Reading: Deep Variational Information Bottleneck. Alemi, A. A. et al. ICLR, 2017
Presenter: Kayla

2/28 Bayesian Deep Learning : Representation Learning
Reading: InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets. Chen, X. et al. NeurIPS, 2016
Presenter: Daniel

3/4 Spring Recess : No Classes
3/6 Spring Recess : No Classes

3/11 Bayesian Deep Learning : Representation Learning
Reading: Information Dropout: Learning Optimal Representations Through Noisy Computation. Achille, A. and Soatto, S. PAMI, 2018
Presenter: Thang

3/13 Generative Models : Variational Autoencoder
Reading: An Introduction to Variational Autoencoders. Kingma, D. P. and Welling, M. arXiv, 2019
Presenter: Natnael

3/18 Generative Models : Variational Autoencoder
3/20 Generative Models : Diffusion Probabilistic Models
3/25 Generative Models : Diffusion Implicit Models
3/27 Generative Models : Energy-Based Models
4/1 Generative Models : Energy-Based Models
4/3 Buffer
4/8 Uncertainty Quantification : Introduction
4/10 Uncertainty Quantification : Variational MI Bounds
4/15 Uncertainty Quantification : MINE
4/17 Uncertainty Quantification : Deep Adaptive Design
4/22 Uncertainty Quantification : Information Noise Contrastive Estimation
4/24 Project Presentations
4/29 Project Presentations
5/1 Project Presentations
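Several readings in the schedule above (the variational dropout and variational autoencoder papers in particular) rely on the reparameterization trick: writing z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, 1), so that gradients of an expectation pass through the samples. The sketch below is a minimal illustration (not course material; the specific values are assumptions), estimating d/dmu E[z^2], whose true value is 2 * mu.

```python
import random

def grad_mu_reparam(mu, sigma, n=200_000):
    """Pathwise (reparameterized) gradient estimate of d/dmu E[z^2]
    for z ~ N(mu, sigma^2): with z = mu + sigma * eps, eps ~ N(0, 1),
    d(z^2)/dmu = 2 * z, so the gradient is E[2 * z] = 2 * mu."""
    total = 0.0
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)   # noise independent of mu
        z = mu + sigma * eps           # reparameterized sample
        total += 2.0 * z               # pathwise derivative of z^2 w.r.t. mu
    return total / n

random.seed(1)
g = grad_mu_reparam(1.5, 0.7)          # true gradient is 2 * mu = 3.0
```

Because the noise is sampled independently of the parameters, this estimator is typically much lower variance than score-function (REINFORCE) gradients, which is the design point exploited by the VAE and variational dropout papers.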

© Jason Pacheco, 2022