EE512A - Advanced Inference in Graphical Models, Fall Quarter, 2014
Last updated: December 3, 2014
This page is located at http://j.ee.washington.edu/~bilmes/classes/ee512a_fall_2014/.
A version of this class was taught previously, and its contents and slides can be found at http://j.ee.washington.edu/~bilmes/classes/ee512a_fall_2011/
Instructor: Prof. Jeff A. Bilmes --- Email me
Office: 418 EE/CS Bldg., +1 206 221 5236
Office hours: TBD, EEB-418 (+ online)
TA: Jounsup Park (firstname.lastname@example.org)
Office hours: Tuesdays, 12:00-2:00pm
Time/Location: Class is held Mon/Wed 11:30am-1:30pm, Paccar Hall, Room 297.
- HW1 is out; see our assignment dropbox to get it.
- Sticky: Almost all of the announcements, homework solutions, and discussion for our class will be posted on Canvas, and can be found at this link. However, do also see the YouTube videos of the lectures below.
Description: This course will cover certain aspects of advanced inference techniques in graphical models.
We will briefly review exact probabilistic inference (which essentially boils down to inference on trees, and includes methods such as junction trees, exact belief propagation and message passing in its various forms, optimal graph triangulations, the elimination family of algorithms, the generalized distributive law (GDL), and its relation to dynamic programming). We'll also cover NP-hardness results for inference, NP-hardness results for finding the best form, and the inapproximability of inference in graphical models (which seems bad).
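As a concrete, entirely illustrative taste of elimination/message passing on a tree, the sketch below runs sum-product elimination on a tiny binary chain MRF (with made-up potentials, not an example from the course) and checks the resulting marginal against brute-force enumeration:

```python
import itertools
import numpy as np

# A 3-node binary chain x1 - x2 - x3 with pairwise potentials
# psi[i][a, b] between x_{i+1} and x_{i+2} (potentials are made up).
psi = [np.array([[1.0, 0.5], [0.5, 2.0]]),
       np.array([[2.0, 1.0], [1.0, 1.0]])]

# Sum-product elimination: eliminate x3, then x2, to get p(x1).
m3_to_2 = psi[1].sum(axis=1)   # message from x3 to x2: sum over x3
m2_to_1 = psi[0] @ m3_to_2     # message from x2 to x1: sum over x2
p1 = m2_to_1 / m2_to_1.sum()   # normalized marginal of x1

# Brute-force check over all 2^3 configurations.
brute = np.zeros(2)
for x in itertools.product([0, 1], repeat=3):
    brute[x[0]] += psi[0][x[0], x[1]] * psi[1][x[1], x[2]]
brute /= brute.sum()
assert np.allclose(p1, brute)
```

The elimination order touches only local potentials at each step, which is exactly why the cost is linear in the chain length rather than exponential in it.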
Next we'll move to the optimistic aspect of the course, where we will cover two general classes of approximate inference methods.
The first general class of techniques is that of exponential models and variational methods. We will follow the 2008 book by Wainwright and Jordan fairly closely; you can get it from the link above if you are at an educational institution (although I urge you to purchase a copy, as it is not too expensive). Of particular interest will be the polyhedral approaches (e.g., the marginal polytope) and the linear programming relaxation methods mentioned in this book.
The second is a class of inference algorithms (more precisely, algorithms for finding the most probable configuration of a set of random variables, also called MPE (most probable explanation) or Viterbi inference) that have become popular in the computer vision community. While we will not dwell on the computer vision applications, we will mostly abstract the algorithms for use in general graphical models, though we might draw on vision examples to show results. These include algorithms that are used when the tree-width of the model is very high and that, even in the high tree-width case, can sometimes provide exact MPE solutions in low-order polynomial time. This includes many of the graph cut methods, including higher-order approximations, and techniques for when the variables are not binary (including alpha-beta swaps, alpha expansions, fusion moves, and other recent, more sophisticated and energy-aware "move making" algorithms that can improve efficiency, such as various forms of "expansion" moves). We will also discuss what to do with global potential functions (i.e., ones where a factor might involve many variables).
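As a small taste of the graph cut methods, the following sketch (my own illustrative construction and variable names, not code from the course) solves exact binary MPE for a submodular pairwise energy by reducing it to an s-t min-cut, computed with a plain Edmonds-Karp max-flow, and checks the answer against brute force:

```python
from collections import deque
import itertools

# Minimize E(x) = sum_i u[i][x_i] + sum_{(i,j)} w[i,j] * [x_i != x_j], w >= 0.
# Standard cut construction: cap(s->i) = u[i][1], cap(i->t) = u[i][0],
# cap(i<->j) = w[i,j]; every s-t cut's capacity equals the energy of a labeling.

def min_cut_labels(n, u, w):
    s, t = n, n + 1
    cap = [[0.0] * (n + 2) for _ in range(n + 2)]
    for i in range(n):
        cap[s][i] = u[i][1]
        cap[i][t] = u[i][0]
    for (i, j), wij in w.items():
        cap[i][j] += wij
        cap[j][i] += wij

    def bfs_path():                      # shortest augmenting path
        parent, q = {s: None}, deque([s])
        while q:
            v = q.popleft()
            for nb in range(n + 2):
                if nb not in parent and cap[v][nb] > 1e-12:
                    parent[nb] = v
                    if nb == t:
                        return parent
                    q.append(nb)
        return None

    while (parent := bfs_path()) is not None:
        f, v = float('inf'), t           # bottleneck capacity along the path
        while parent[v] is not None:
            f, v = min(f, cap[parent[v]][v]), parent[v]
        v = t
        while parent[v] is not None:     # push flow, update residuals
            cap[parent[v]][v] -= f
            cap[v][parent[v]] += f
            v = parent[v]

    seen, q = {s}, deque([s])            # source side of the min cut -> label 0
    while q:
        v = q.popleft()
        for nb in range(n + 2):
            if nb not in seen and cap[v][nb] > 1e-12:
                seen.add(nb)
                q.append(nb)
    return [0 if i in seen else 1 for i in range(n)]

def energy(x, u, w):
    return (sum(u[i][x[i]] for i in range(len(x)))
            + sum(wij for (i, j), wij in w.items() if x[i] != x[j]))

u = [(0.0, 2.0), (3.0, 1.0), (1.0, 1.5)]   # u[i] = (cost of x_i=0, cost of x_i=1)
w = {(0, 1): 1.0, (1, 2): 2.0}
x = min_cut_labels(3, u, w)
best = min(energy(list(c), u, w) for c in itertools.product([0, 1], repeat=3))
assert energy(x, u, w) == best
```

The point of the reduction is that the cut is found in low-order polynomial time regardless of the graph's tree-width; submodularity of the pairwise terms is what makes all the needed capacities nonnegative.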
We will be covering papers of the form you can find in a new text on inference in Markov random fields for computer vision although, again, we will talk about the methods more generically (the methods are indeed much more widely applicable than just to problems in computer vision, yet there has been much recent innovation in graphical model inference in the computer vision community). We will also draw from papers that have recently appeared (links will be posted here on this web page as the class proceeds).
Prerequisites: ideally, knowledge of probability, statistics, convex optimization, and some combinatorial optimization, although these will be reviewed as necessary. It would be useful to have some basic knowledge of graphical models, as we'll review the basics fairly rapidly. The course is open to students in all UW departments. If you are in doubt about taking this course, please talk to me in class or during office hours.
Texts: see above.
Grades and Assignments: Grades will be based on a combination of a final project (35%), homeworks (35%), and a take-home midterm exam (30%).
There will be between three and five homeworks during the quarter. Some of the homeworks will include project proposals, project proposal revisions, and project progress reports.
Final project: The final project will consist of a 4-page paper (conference style) and a final project presentation. The project must involve using or dealing mathematically with graphical models in some way or another. Please contact me and/or stop by office hours early in the quarter to discuss project ideas. Note there is a chance that, for the final project, we will instead hold a programming contest for probabilistic inference; more on this soon.
Homework: Homework must be done and submitted electronically as a PDF file via the following link https://canvas.uw.edu/courses/914697/assignments. The PDF file, however, can be a scanned copy of hand-written solutions.
Lecture Slides: Lecture slides will be made available as they are being prepared --- they will probably appear shortly before a given lecture, and they will be in PDF format (original source is LaTeX). Note that these slides are corrected after the lecture (and might also include some additional discussion we had during lecture). If you find bugs/typos in these slides, please email me. The slides are available as "presentation slides" and also in (mostly the same content) 2-up form for printing. After lecture, the marked-up slides will appear under the "presented slides" column (and might include typo fixes, ink corrections, and other little notes/discussions/drawings that occurred during class). Also, the lectures will be posted to YouTube (see below, or see my UW YouTube channel).
| Lec. # | Lecture Date | Last Updated | Contents |
|---|---|---|---|
| 1 | 9/29/2014 | 9/29/2014 | Introduction, Families, Semantics |
| 2 | 10/1/2014 | 10/1/2014 | MRFs, Inference on Trees |
| 3 | 10/6/2014 | 10/6/2014 | Tree inference, more general queries, non-trees |
| 4 | 10/8/2014 | 10/8/2014 | Non-trees, perfect elimination, triangulated graphs |
| 5 | 10/13/2014 | 10/13/2014 | Triangulated graphs, k-trees, the triangulation process/heuristics |
| 6 | 10/15/2014 | 10/15/2014 | Multiple queries, decomposable models, junction trees |
| 7 | 10/20/2014 | 10/22/2014 | Junction trees, begin intersection graphs |
| 8 | 10/22/2014 | 10/22/2014 | Intersection graphs, inference on junction trees |
| 9 | 10/27/2014 | 11/2/2014 | Inference on junction trees, semirings, conditioning |
| 10 | 11/3/2014 | 11/4/2014 | Conditioning, hardness, LBP |
| 11 | 11/5/2014 | 11/9/2014 | LBP, exponential models |
| 12 | 11/10/2014 | 11/10/2014 | Exponential models, mean params and polytopes, tree outer bound |
| 13 | 11/12/2014 | 11/19/2014 | Polytopes, tree outer bound, Bethe entropy approx. |
| 14 | 11/17/2014 | 11/17/2014 | Bethe entropy approx., loop series |
| 15 | 11/19/2014 | 11/23/2014 | Hypergraphs, posets, Mobius, Kikuchi |
| 16 | 11/24/2014 | 11/26/2014 | Kikuchi, Expectation Propagation |
| 17 | 11/26/2014 | 12/1/2014 | Expectation Propagation, Mean Field |
| 18 | 12/1/2014 | 12/2/2014 | Structured mean field, convex relaxations and upper bounds, tree-reweighted case |
| 19 | 12/3/2014 | 12/3/2014 | Variational MPE, Graph Cut MPE, LP Relaxations |
You can post questions, discussion topics, or general information at this link.
Relevant Books: There are many books available that discuss some of the material we are covering in this course. Some good books are listed below, but see the end of the lecture slides for books/papers relevant to each specific lecture. We will be covering the first two books closely this term.
- Wainwright and Jordan, Graphical Models, Exponential Families, and Variational Inference. This will be one of the main books we will cover this term.
- Blake, Kohli, and Rother, Markov Random Fields for Vision and Image Processing is a very recent book that will cover key aspects of the MRF inference techniques we will talk about.
- Koller and Friedman, Probabilistic Graphical Models
- Kevin Murphy, Machine Learning: a Probabilistic Perspective
- "An Introduction to Bayesian Networks", F.V. Jensen, 1996. A good general introduction to Bayesian networks (out of print, but available in the library).
- "Bayesian Networks and Decision Graphs", F.V. Jensen, 2001. Another good general introduction to Bayesian networks (not out of print).
- "Graphical Models", S.L. Lauritzen. Oxford, 1996. A very complete, theoretically precise, but dense text, authored by one of the field's leading authorities.
- "Probabilistic Networks and Expert Systems", R.G. Cowell, A.P. Dawid, S.L. Lauritzen, and D.J. Spiegelhalter. 1999. Similar to the previous text, but includes more material on inference, applications, and other general problems.
- "Artificial Intelligence: A Modern Approach: 2nd Edition", S. Russel and P. Norvig, 2003. Has a nice introductory chapter on Bayesian networks.
- "Learning in Graphical Models", Ed. by M.I. Jordan. An excellent collection of recent research papers compiled by Mike Jordan, one of the leading experts in this field.
- "Probabilistic Reasoning in Intelligent Systems", J. Pearl. 1998. A classic early text by one of the founders of the field. Pearl is credited with inventing the term "Bayesian networks".
- "Causality", J. Pearl. 2000. A relatively newer text by Pearl specifically on causality, and causal modeling. A second edition of this book is out in 2009.
- "Pattern Classification", R. Duda, P. Hart and D. Stork (the text used for 596I). The original text (published in 1973) is still widely read.
- "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Hastie, Tibshirani, and Friedman. There is a 2nd edition that came out in 2008.
- "Neural Networks for Pattern Recognition", by C. Bishop, 1996. (available now in the UW bookstore). This book mainly contains background material, but has become a classic text in pattern recognition even though "neural networks" is in the title, and is worth reading if you plan to do any work at all in pattern recognition.
- Another book by C. Bishop that came out in 2006 is "Pattern Recognition and Machine Learning", which is here. This book also contains a nice overview chapter on graphical models and Bayesian networks.
- Final project presentations: Wednesday, December 10, 2014, 2:30-4:20pm, PCAR 297.
If you must, you can send me or the TA anonymous email