CS 8803 PGM
Introduction to Probabilistic Graphical Models

Spring 2007
Klaus 1447
MW 3:00 - 4:00



This course provides an introduction to the theory, algorithms, and applications of probabilistic graphical models. These models represent the joint distribution of a set of random variables as a factorization over a graph. They include Bayesian networks, Markov random fields, and hidden Markov models as special cases. Topics include representation, inference, parameter and structure learning, variational inference, belief propagation, and causality. The development will be motivated by applications in computer vision and speech, decision support, and bioinformatics.
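
To make the factorization idea concrete, here is a minimal illustrative sketch in Matlab (an example of my own, not taken from the course handouts): a two-node Bayesian network A -> B, whose joint distribution factors as P(A, B) = P(A) P(B | A).

    % Two-node Bayesian network A -> B with binary variables (made-up numbers).
    pA   = [0.7 0.3];            % P(A): prior over the two states of A
    pBgA = [0.9 0.1;             % P(B | A): row a holds the distribution P(B | A = a)
            0.4 0.6];
    joint = diag(pA) * pBgA;     % joint(a,b) = P(A = a) * P(B = b | A = a)
    disp(sum(joint(:)))          % sanity check: the entries sum to 1

The same pattern scales up: in general the joint distribution is the product of one conditional distribution per node given its parents in the graph.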

Instructor

Jim Rehg
Email: rehg@cc.gatech.edu
Office: TSRB 230B
Office hours: After class or by appointment
Phone: 404-894-9105 (email preferred)

Prerequisites

Familiarity with probability, statistics, and linear algebra is essential. Familiarity with Matlab will be helpful for some of the assignments. For CS students, CS 8803 Mathematics for Computational Perception provides a satisfactory background for this class. Please contact me if you have questions.


Text

Draft chapters will be handed out in class from the following unpublished books:

Students wishing to purchase a textbook should consider:


Organization

Grades will be assessed as follows:

Problem Sets 50%
Midterm 10%
Final Project 40%

A single homework problem will be given at the end of most lectures. Homework write-ups will be collected at the start of the following class, and the solution will be presented in class. My intent is to encourage you to keep up with the material and attend the lectures. Collaboration on the homework is encouraged at the "white board interaction" level: share ideas and technical conversation, but do your own detailed derivations, write your own code, etc. Some assignments may require you to use Matlab.

There is a take-home midterm and a final project. Instead of a final exam, we will have oral presentations of final project reports during the final exam period. You are encouraged to do a final project which is related to your research interests. Project proposals will be due shortly after the midterm. Depending on the number of enrolled students, I may require you to work in groups of two or more for the final project.

No late submissions of homework will be accepted. Enough problems will be assigned that missing a few due to absences will not significantly affect your grade. Undergraduate and graduate students will be graded in an identical manner in this course.


Syllabus

  1. Introduction
  2. Bayesian networks
  3. Markov networks
  4. Exact inference
  5. Parameter learning
  6. Approximate inference
  7. Structure learning and causality

Problem Sets


Resources

Overview

Applications

Help with Matlab



Final Projects

Schedule

Possible Projects