
E0 269: Probabilistic Graphical Models (Jan-Apr 2011)

Instructor: Indrajit Bhattacharya

Schedule: Tue, Thu 11:30-1:00, CSA 117

Pre-requisites:

  • Introduction to Probability and Statistics
  • Consent of Instructor

Text/Reading:

  • "Probabilistic Graphical Models: Principles and Techniques", Daphne Koller and Nir Friedman, MIT Press, 2009
  • Relevant papers

Grading Policy: Class Assignments, Midterm, Final Exam / Course Project, Class Participation

News

April 14: New project report deadline: 19/4 Tuesday
April 14: Final Lecture 14/4 2-4pm
April 5: Handout on Junction Tree Algorithm
Mar 31: Handout on HMMs
Mar 24: Final Project Report due April 15
Mar 24: Handout on EM and Mixture Models
Mar 22: Exam 2 graded
Mar 17: Handout on Parameter Estimation
Mar 15: Project Proposals due this week
Feb 10: Handout on Sum Product
Feb 10: Exam 1 graded
Feb 10: Handout on Variable Elimination
Jan 25: Handout on Undirected Graphical Models
Jan 20: First Midterm Exam on Feb 1
Jan 20: Handout on Directed Graphical Models
Jan 6: First class

Tentative Syllabus

Graph types: conditional independence; directed, undirected, and factor models; algorithms for testing conditional independence; d-separation; Markov properties on graphs; factorization; the Hammersley-Clifford theorem.
Static Models: linear Gaussian models, mixture models, factor analysis, Markov random fields, Gibbs distributions, static conditional random fields (CRFs), multivariate Gaussians as graphical models, the exponential family, generalized linear models.
Dynamic Models: hidden Markov models, Kalman filtering, and linear-Gaussian HMMs.
Exact Inference: the elimination family of algorithms; relation to dynamic programming; belief propagation; junction trees; optimal triangulations; NP-hardness results.
Approximate Inference: loopy belief propagation (BP), expectation propagation (EP), sampling (Markov chain Monte Carlo, Metropolis-Hastings, Gibbs sampling), particle filtering.
Structure Learning: the Chow-Liu algorithm.
Latent Dirichlet Allocation: exchangeability, de Finetti's theorem, inference using collapsed Gibbs sampling.
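The elimination idea listed under Exact Inference can be sketched on a toy example. The following is an illustrative snippet, not course material; the chain A -> B -> C and all probability tables are made-up assumptions. It contrasts marginalizing the full joint against eliminating variables one at a time:

```python
import numpy as np

# Toy binary chain A -> B -> C (all numbers are invented for illustration).
P_A = np.array([0.6, 0.4])                # P(A)
P_B_given_A = np.array([[0.7, 0.3],       # P(B | A=0)
                        [0.2, 0.8]])      # P(B | A=1)
P_C_given_B = np.array([[0.9, 0.1],       # P(C | B=0)
                        [0.5, 0.5]])      # P(C | B=1)

# Naive approach: materialize the full joint P(A,B,C), then marginalize.
# Cost grows exponentially with the number of variables.
joint = P_A[:, None, None] * P_B_given_A[:, :, None] * P_C_given_B[None, :, :]
P_C_naive = joint.sum(axis=(0, 1))

# Variable elimination: sum out A first, then B, passing along a small
# intermediate factor; the full joint is never built.
factor_B = P_A @ P_B_given_A      # sum_A P(A) P(B|A)      -> factor over B
P_C = factor_B @ P_C_given_B      # sum_B factor(B) P(C|B) -> P(C)

assert np.allclose(P_C, P_C_naive)
print(P_C)
```

On longer chains the same two-line pattern keeps every intermediate factor the size of a single variable's table, which is the point of the elimination family of algorithms.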

Lecture Schedule

Thu 06/1: Class Introduction and Logistics
Tue 11/1: Introduction to Graphical Models
Thu 13/1: Directed Graphical Models: Factorization
Tue 18/1: Directed Graphical Models: Conditional Independence
Thu 20/1: Undirected Graphical Models: Conditional Independence
Tue 25/1: Undirected Graphical Models: Factorization
Thu 27/1: Wrapping up GM Semantics; Intro to Inference
Tue 1/2: First Midterm Exam
Thu 3/2: Variable Elimination: Warming up
Tue 8/2: Variable Elimination Analysis
Thu 10/2: Variable Elimination Summary; Exam Discussion
Tue 15/2: Sum Product Algorithm
Thu 17/2: Sum Product and Factor Graphs
Tue 22/2: MAP Inference and Max Product
Thu 24/2: Review: Density Estimation and Regression
Tue 1/3: Review: Classification and Generative Models
Thu 3/3: Review: Discriminative Models
Tue 8/3: No Class
Thu 10/3: Second Midterm Exam
Tue 15/3: Parameter Estimation in Completely Observed Graphical Models
Thu 17/3: Mixture Models, Latent Variables, EM
Tue 22/3: Exam Discussion, EM Analysis
Thu 24/3: Parameter Estimation in Partially Observed Graphical Models
Tue 29/3: Hidden Markov Model
Thu 31/3: Junction Tree Algorithms
Tue 5/4: Sampling Algorithms for Inference
Thu 7/4: Sampling Algorithms for Inference
Tue 12/4: Topic Models
Thu 14/4: Topic Models

Additional Reading

Graphical Model Resources