Graphical Models

 

Instructors

Prof. Volkan Cevher

Prof. Matthias Seeger

 

Description

The course provides graduate students with a diverse set of mathematical tools drawn from statistical inference and learning; graph theory, signal processing, and systems; coding theory and communications; and information theory. We will discuss exact and approximate statistical inference over large numbers of interacting variables and develop probabilistic and optimization-based computational methods. Topics include hidden Markov models, belief propagation, and variational Bayesian methods (e.g., the expectation-maximization algorithm). We will read research papers and book chapters to understand the benefits and limitations of these algorithms.
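As a concrete taste of the message-passing algorithms covered in the course, the short sketch below runs sum-product belief propagation on a three-variable chain and checks one marginal by brute force. The potentials and the NumPy implementation are illustrative choices, not course code.

  # Illustrative sketch (not course code): sum-product belief propagation
  # on a chain x1 -- x2 -- x3 of binary variables with made-up potentials.
  import numpy as np

  unary = [np.array([0.7, 0.3]),   # psi_1(x1)
           np.array([0.5, 0.5]),   # psi_2(x2)
           np.array([0.2, 0.8])]   # psi_3(x3)
  pair = np.array([[1.0, 0.5],     # psi(x_i, x_{i+1}), shared by both edges
                   [0.5, 1.0]])

  # Forward messages: m_f[i](x_i) = sum over x_{i-1} of
  # psi_{i-1}(x_{i-1}) * psi(x_{i-1}, x_i) * m_f[i-1](x_{i-1})
  m_f = [np.ones(2) for _ in range(3)]
  for i in range(1, 3):
      m_f[i] = pair.T @ (unary[i - 1] * m_f[i - 1])

  # Backward messages, defined symmetrically from the right end of the chain
  m_b = [np.ones(2) for _ in range(3)]
  for i in range(1, -1, -1):
      m_b[i] = pair @ (unary[i + 1] * m_b[i + 1])

  # Node marginals: local potential times both incoming messages, normalized
  for i in range(3):
      b = unary[i] * m_f[i] * m_b[i]
      print(f"p(x{i + 1}) =", b / b.sum())

  # Brute-force check of the middle marginal
  joint = np.einsum('i,j,k,ij,jk->ijk', unary[0], unary[1], unary[2], pair, pair)
  print("exact p(x2) =", joint.sum(axis=(0, 2)) / joint.sum())

On a tree the same two sweeps give exact marginals; on graphs with cycles the loopy variant covered later in the course reuses these updates iteratively, without the exactness guarantee.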

 

Recommended Reading Material

Textbooks:

  • Christopher M. Bishop, Pattern Recognition and Machine Learning.
  • M.J. Wainwright and M.I. Jordan, Graphical Models, Exponential Families, and Variational Inference [advanced].
  • S.L. Lauritzen, Graphical Models [advanced].
  • M.I. Jordan (ed.), Learning in Graphical Models.
  • Daphne Koller and Nir Friedman, Probabilistic Graphical Models [advanced].
  • J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference.

Tutorial and Research Papers:

  • F.R. Kschischang, B.J. Frey, and H.-A. Loeliger, Factor Graphs and the Sum-Product Algorithm, IEEE Transactions on Information Theory, Vol. 47, No. 2, February 2001.
  • W. Xu, Q. Zhu and M. I. Jordan, The Junction Tree Algorithm, Class notes, UC Berkeley, CS281A/Stat241A, Fall 2004.
  • R. Cowell, Introduction to Inference for Bayesian Networks.
  • M. I. Jordan, Z. Ghahramani, T. S. Jaakkola and L. K. Saul, An Introduction to Variational Methods for Graphical Models, Machine Learning, Vol. 37, 1999.
  • T. Minka, Divergence Measures and Message Passing, Microsoft Research Ltd. Tech. Report MSR-TR-2005-173, December 2005.
  • V. Cevher, M. Duarte, C. Hegde, and R. Baraniuk, Sparse Signal Recovery Using Markov Random Fields, 2008.
  • R.G. Baraniuk, V. Cevher, M.F. Duarte and C. Hegde, Model-Based Compressive Sensing, 2008.
  • M. Seeger and D. Wipf, Variational Bayesian Inference Techniques, IEEE Signal Processing Magazine, 2010.
  • M. Seeger, Tutorial on Sparse Linear Models: Reconstruction and Approximate Inference (http://ipg.epfl.ch/~seeger/lapmalmainweb/teaching/dagm10/index.html).

 

Available Codes

 

Outline

 

   
  • Week 1: Introduction (motivation and logistics). Pre-approved list of papers.
  • Week 2: Introduction (basic probability and Bayes); graphical models and belief propagation I. Homework 1.
  • Week 3: Graphical models and belief propagation II; the Gaussian distribution. Handout: Gaussian computations. Homework 2.
  • Week 4: Numerical mathematics and optimization; latent variable models. Handout: conjugate gradients algorithm. Homework 3.
  • Week 5: Expectation-maximization (EM) algorithm; dynamical state-space models. Homework 4.
  • Week 6: Information theory and variational approximation; variational inference relaxations. Homework 5.
  • Week 7: Loopy belief propagation. Handout: Bethe free energy view on loopy belief propagation. Homework 6.
  • Week 8: Sparse linear models and compressible priors. Homework 7.
  • Week 9: Sparse linear models and compressible priors; convex/lp relaxations. Homework 8.
  • Week 10: Continuous variable models. Homework 9.
  • Week 11: Inference for continuous variable models; expectation propagation.
  • Week 12: Large-scale variational inference; class presentations.
  • Week 13: No lectures this week.
  • Week 14: Class presentations.