CS 688, Spring 2016, UMass Amherst CS

Instructor: Brendan O’Connor, brenocon AT cs.umass.edu

Lecture: MW 2:30-3:45, Engineering Laboratory, room 304

TA: Tao Sun, taosun AT cs.umass.edu

See the Moodle and Piazza sites for this course.

Probabilistic graphical models are an intuitive visual language for describing the structure of joint probability distributions using graphs. They enable the compact representation and manipulation of exponentially large probability distributions, which allows them to efficiently manage the uncertainty and partial observability that commonly occur in real-world problems. As a result, graphical models have become invaluable tools in a wide range of areas from computer vision and sensor networks to natural language processing and computational biology.
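To make the compactness claim concrete (a minimal illustration, not part of the course materials): a full joint table over n binary variables has exponentially many free parameters, while a Bayesian network whose nodes each have at most k parents needs only on the order of n·2^k. The function names below are ours, chosen for illustration.

```python
# Parameter counts for a distribution over n binary variables:
# the full joint table vs. a Bayesian network in which every node
# has at most k parents (a rough upper bound, for illustration).

def full_joint_params(n):
    # A joint table over n binary variables has 2^n entries;
    # one is fixed by the sum-to-one constraint.
    return 2**n - 1

def bayes_net_params(n, k):
    # Each node stores P(X | parents): for each of (at most) 2^k
    # parent configurations, one free probability for a binary X.
    return n * 2**k

print(full_joint_params(20))    # 1048575
print(bayes_net_params(20, 3))  # 160
```

Even for just 20 binary variables, the factored representation replaces over a million free parameters with a few hundred, which is what makes inference and learning tractable in practice.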

The aim of this course is to develop the knowledge and skills necessary to effectively design, implement and apply these models to solve real problems. The course will cover (a) Bayesian and Markov (MRF) networks; (b) exact and approximate inference methods; (c) estimation of both the parameters and structure of graphical models. Students entering the class should have good programming skills and knowledge of algorithms. Undergraduate-level knowledge of probability and statistics is recommended.

This course emphasizes the versatile compositionality of PGMs' core building blocks: intuitive representations and algorithms that can be combined to derive a huge variety of models and inference/learning procedures. Probabilistic graphical models unify a very broad range of statistical and machine learning methods, and allow the invention of new ones too. These concepts are essential for advanced work in machine learning, artificial intelligence, and statistical modeling.

An approximate list of topics (subject to change):

- Introduction and Probability Theory
- Bayesian Networks
- KL Divergence and Learning in Bayesian Networks
- Markov Random Fields
- Inference in Markov Random Fields
- Exact Inference by Message Passing
- Sum-Product Implementation and Learning Markov Random Fields
- Markov Random Fields and Bayesian Networks
- Particle Representations and Monte Carlo Integration
- Markov Chains and MCMC Methods
- Metropolis Hastings Algorithm
- MCMC and Learning
- Variational Inference
- Variational Learning
- Loopy Belief Propagation
- Factor Graphs
- Bayesian Inference

**Textbook:** Kevin Murphy (2012), “Machine Learning: a Probabilistic Perspective.” (book website).

**Who should take this course?** This is a PhD-level course that is designed for students who want depth in statistical machine learning. It may also be useful for related areas such as probabilistic inference in intelligent systems. This course focuses on mathematical formalisms, algorithms, and models. If you haven’t taken a course in machine learning, artificial intelligence, or statistical modeling before (e.g. CS 589, CS 689, CS 383, CS 683, STAT 597*, STAT 697*, etc.), the motivation for this course may not be as clear.