Lecture # Date Link and Topic
1 08/30/2017 Course information, Introduction to factorization and message passing
2 09/01/2017 Independence, Pairwise independence, Conditional independence
3 09/04/2017 Introduction to Bayesian Networks
4 09/06/2017 Topological ordering
Factorization implies I-MAP
Text classification using Naive Bayes classifier
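As a rough illustration of the Naive Bayes text classifier covered in this lecture, here is a minimal sketch in plain Python with Laplace smoothing; the documents, words, and labels are made-up toy data, not course material:

```python
# Toy Naive Bayes text classifier (bag of words, Laplace smoothing).
# All training data below is hypothetical.
from collections import Counter, defaultdict
import math

def train(docs):
    """docs: list of (word_list, label). Returns label counts, per-label word counts, vocab."""
    label_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for words, label in docs:
        label_counts[label] += 1
        word_counts[label].update(words)
        vocab.update(words)
    return label_counts, word_counts, vocab

def predict(words, label_counts, word_counts, vocab):
    """Pick the label maximizing log P(label) + sum_w log P(w | label)."""
    total_docs = sum(label_counts.values())
    best, best_lp = None, -math.inf
    for label, n in label_counts.items():
        lp = math.log(n / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)  # Laplace smoothing
        for w in words:
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

docs = [(["free", "money", "now"], "spam"),
        (["meeting", "tomorrow"], "ham"),
        (["free", "offer"], "spam"),
        (["lunch", "tomorrow"], "ham")]
model = train(docs)
print(predict(["free", "money"], *model))  # -> spam
```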
5 09/11/2017 Introduction to pgmpy
Misconception example to motivate the need for undirected graphical models
6 09/13/2017 Introduction to undirected graphical models
7 09/15/2017 Factorization implies independencies, 3 types of independencies
8 09/18/2017 Proof of the Hammersley Clifford Theorem
9 09/15/2017 Introduction to Factor graphs
Examples of graphical models - Error correcting codes, K-SAT problems
10 09/25/2017 State space models and factor graphs
11 09/27/2017 A bit of statistical physics - Boltzmann distribution, free energy, 1-D and 2-D Ising models, phase transitions
12 10/02/2017 Sum product algorithm for factor trees
13 10/04/2017 Iterative implementation of the sum product algorithm
Sum-product algorithm for HMMs
14 10/06/2017 Forward Backward algorithm for HMMs
15 10/09/2017 Recursion in the FB algorithm
Min-sum algorithm, Max-product, Viterbi algorithm, generalization to semi-rings
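To make the max-product/Viterbi idea from these lectures concrete, here is a small log-domain Viterbi sketch for a two-state HMM; the states, observations, and probabilities are an invented toy example:

```python
# Toy Viterbi decoding (max-product in the log domain) for a 2-state HMM.
# All model parameters below are hypothetical.
import math

def viterbi(obs, states, log_pi, log_A, log_B):
    """Return the most likely state sequence by dynamic programming on log scores."""
    V = [{s: log_pi[s] + log_B[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        scores, ptr = {}, {}
        for s in states:
            # Best predecessor state for s, taking transitions into account.
            prev, sc = max(((p, V[-1][p] + log_A[p][s]) for p in states),
                           key=lambda t: t[1])
            scores[s] = sc + log_B[s][o]
            ptr[s] = prev
        V.append(scores)
        back.append(ptr)
    # Backtrack from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

lg = math.log
states = ["Rain", "Sun"]
log_pi = {"Rain": lg(0.6), "Sun": lg(0.4)}
log_A = {"Rain": {"Rain": lg(0.7), "Sun": lg(0.3)},
         "Sun":  {"Rain": lg(0.4), "Sun": lg(0.6)}}
log_B = {"Rain": {"walk": lg(0.1), "umbrella": lg(0.9)},
         "Sun":  {"walk": lg(0.8), "umbrella": lg(0.2)}}
print(viterbi(["umbrella", "umbrella", "walk"], states, log_pi, log_A, log_B))
# -> ['Rain', 'Rain', 'Sun']
```

Replacing max with sum in the recursion gives the forward pass of the sum-product algorithm, which is the semi-ring generalization mentioned above.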
16 10/11/2017 Gaussian graphical models
17 10/13/2017 Kalman filter and message passing
18 10/16/2017 Gaussian BP on factor trees
19 10/18/2017 Loopy Belief Propagation
Junction Trees
20 10/20/2017 Junction Trees (continued)
21 10/23/2017 Chordal graphs and variable elimination
Project suggestions
22 10/25/2017 Example showing the sub-optimality of loopy BP
Review of some information-theoretic quantities
10/30/2017 Variational Inference
11/01/2017 Variational Inference and interpretation of BP as an iterative solution to variational free energy minimization
11/03/2017 More inference as optimization
11/06/2017 Midterm exam
11/08/2017 Inference as optimization
11/13/2017 Overview of learning in graphical models
11/15/2017 Parameter estimation in directed graphical models
11/20/2017 Parameter estimation from partially observed (incomplete) data - problem statement
ML parameter estimation is hard with incomplete data
Description of the Expectation Maximization (EM) algorithm
EM algorithm for mixture Gaussian models Part 1
EM algorithm for mixture Gaussian models Part 2
The EM algorithm converges to a local maximum
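As a rough sketch of the EM iterations for Gaussian mixtures described in these lectures, here is a 1-D two-component version in plain Python; the data points and initialization are made up for illustration:

```python
# Toy EM for a two-component 1-D Gaussian mixture. Data and inits are hypothetical.
import math

def em_gmm(data, iters=50):
    mu = [min(data), max(data)]   # crude initialization at the data extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities r[i][k] proportional to pi_k * N(x_i; mu_k, var_k)
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            z = sum(p)
            resp.append([pk / z for pk in p])
        # M-step: re-estimate weights, means, variances from the soft counts
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)   # guard against variance collapse
    return pi, mu, var

data = [0.1, -0.2, 0.0, 0.3, 4.9, 5.2, 5.0, 4.8]
pi, mu, var = em_gmm(data)
print(mu)  # two means, roughly near 0 and 5
```

Each iteration cannot decrease the data log-likelihood, which is the convergence-to-a-local-maximum property stated above.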
11/27/2017 Baum-Welch algorithm for estimating the parameters of an HMM
11/29/2017 Structure learning - overview, ML approach to structure learning
12/01/2017 Chow-Liu algorithm, Bayesian score and its relation to the ML score
Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License