January 14 (Tue), 2020 Introduction to the Course: Goals and Topics to Be Covered. Example of plausible reasoning and causal graphs: icy roads. Reasoning in the evidential and causal direction.
January 16 (Thu), 2020 HW1: Do exercises 2.1, 2.2, and 2.3 [J07], due on Thursday, 2020-01-23. Examples of plausible reasoning and causal graphs: icy roads and wet grass. A first look at the Hugin Bayesian network and influence diagram tool. Reasoning in the evidential and causal direction. Detailed small example of computation of certainties in a compositional system. Problems with compositional systems. The Chernobyl example and the importance of structure in a causal model.
January 21 (Tue), 2020 The earthquake example. Probability. The three axioms of Kolmogorov [1950]. Outcome space, event space, probability measure. Three historically important interpretations in which the axioms of Kolmogorov are true (i.e., three models of them): classical, frequentist, and subjective. Presentation of the classical approach, including sketch of the proof of the three properties that are the axioms of Kolmogorov. Brief presentation of the frequentist approach. Subjective probability defined from betting behavior: "[De Finetti] defines the probability P(E) of an event E as the fraction of a whole unit value which one would feel is the fair amount to exchange for the promise that one would receive a whole unit of value if E turns out to be true and zero units if E turns out to be false" [Neapolitan, 1990, p.56]. Similarly, subjective probability could be defined using the equivalent urn model. The probability P(E) of an event E is the fraction of red balls in an urn containing red and brown balls such that one would feel indifferent between the statement "E will occur" and "a red ball would be extracted from the urn." (I believe that this definition is due to D.V. Lindley.) Kolmogorov's axioms and the "definition" of conditional probability can be derived from the definition of subjective probability and the assumption of coherence in (betting) behavior, as explained on the slides. Neapolitan's [1990] definition of Bayesian network.
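For reference, a compact restatement (mine, not a quotation from the slides) of the three Kolmogorov axioms in their finite-additivity form, followed by the usual definition of conditional probability mentioned above, in LaTeX notation:

    % Kolmogorov's axioms over an outcome space \Omega and an event space of subsets of \Omega:
    P(E) \ge 0 \quad \text{for every event } E
    P(\Omega) = 1
    P(E_1 \cup E_2) = P(E_1) + P(E_2) \quad \text{whenever } E_1 \cap E_2 = \emptyset
    % The "definition" of conditional probability referred to above:
    P(A \mid B) = \frac{P(A \cap B)}{P(B)} \quad \text{when } P(B) > 0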
January 23 (Thu), 2020 HW1 will be accepted until Tuesday, January 28, without penalty. HW2 assigned: exercises 1.7, 1.11, 1.12, 1.13 [J07], due on Thursday, January 30. Prerequisites on Probability Theory: Ch. 1 [J07] completed. Causal networks and d-separation through most of section 2.2 [J07].
January 28 (Tue), 2020 HW3 assigned: exercises 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 2.10, 2.12, 2.14 [J07], due on Thursday, February 6. Definition of d-separation and d-connectedness (def. 2.1 [J07]). Examples of d-separation. Definition of Markov blanket (def. 2.2 [J07]). The claim at the end of p.30 [J07] discussed briefly. Lauritzen's d-separation algorithm using moralized ancestral graphs. (A proof of correctness has been posted on the course website under "Lecture Notes.")
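Below is a minimal Python sketch of Lauritzen's test as described above (restrict to the ancestral graph of the query variables, moralize, drop directions, and check separation); the DAG encoding and the wet-grass query at the end are my own illustration, not taken from [J07]:

    # d-separation test via the moralized ancestral graph (Lauritzen's algorithm).
    # A DAG is a dict mapping each node to the list of its parents.

    def ancestors(dag, nodes):
        """Return the given nodes together with all of their ancestors."""
        result, stack = set(nodes), list(nodes)
        while stack:
            for p in dag[stack.pop()]:
                if p not in result:
                    result.add(p)
                    stack.append(p)
        return result

    def d_separated(dag, xs, ys, zs):
        """True iff every node in xs is d-separated from every node in ys given zs."""
        keep = ancestors(dag, set(xs) | set(ys) | set(zs))
        # Moralize the ancestral graph: marry parents, drop edge directions.
        adj = {v: set() for v in keep}
        for v in keep:
            ps = [p for p in dag[v] if p in keep]
            for p in ps:
                adj[v].add(p); adj[p].add(v)
            for i in range(len(ps)):
                for j in range(i + 1, len(ps)):
                    adj[ps[i]].add(ps[j]); adj[ps[j]].add(ps[i])
        # Check separation: remove zs, then look for a path from xs to ys.
        blocked = set(zs)
        frontier = [x for x in xs if x not in blocked]
        seen = set(frontier)
        while frontier:
            v = frontier.pop()
            if v in ys:
                return False
            for w in adj[v]:
                if w not in seen and w not in blocked:
                    seen.add(w)
                    frontier.append(w)
        return True

    # Example: the wet-grass collider structure Rain -> WetGrass <- Sprinkler.
    dag = {"Rain": [], "Sprinkler": [], "WetGrass": ["Rain", "Sprinkler"]}
    print(d_separated(dag, {"Rain"}, {"Sprinkler"}, set()))         # True (collider unobserved)
    print(d_separated(dag, {"Rain"}, {"Sprinkler"}, {"WetGrass"}))  # False (collider observed)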
January 30 (Thu), 2020 Bayesian networks consist of a DAG and a set of conditional probability tables. Derivation of the chain rule for Bayesian networks from the definition by Neapolitan [1990]. The local Markov condition for Bayesian networks. Detailed computation of initial and updated probabilities in the icy roads and wet grass examples.
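A small Python illustration of the chain rule P(I, H, W) = P(I) P(H | I) P(W | I) for the icy roads structure; the numbers are made up for illustration and are not the ones used in class:

    # Chain rule for the icy roads network: Icy -> Holmes, Icy -> Watson.
    # P(I, H, W) = P(I) * P(H | I) * P(W | I).  Numbers are illustrative only.

    p_icy = {"yes": 0.7, "no": 0.3}
    p_holmes_given_icy = {"yes": {"crash": 0.8, "ok": 0.2}, "no": {"crash": 0.1, "ok": 0.9}}
    p_watson_given_icy = {"yes": {"crash": 0.8, "ok": 0.2}, "no": {"crash": 0.1, "ok": 0.9}}

    def joint(i, h, w):
        return p_icy[i] * p_holmes_given_icy[i][h] * p_watson_given_icy[i][w]

    # Initial (prior) probability that Holmes crashes: marginalize out Icy and Watson.
    p_h_crash = sum(joint(i, "crash", w) for i in p_icy for w in ("crash", "ok"))

    # Updated probability that Holmes crashes given that Watson crashed:
    # P(H = crash | W = crash) = P(H = crash, W = crash) / P(W = crash).
    p_hw = sum(joint(i, "crash", "crash") for i in p_icy)
    p_w  = sum(joint(i, h, "crash") for i in p_icy for h in ("crash", "ok"))
    print(p_h_crash, p_hw / p_w)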
February 4 (Tue), 2020 HW3 was split into two parts, as follows. HW3: exercises 2.4, 2.5, 2.6, 2.7, 2.8, 2.9 [J07], due on Thursday, February 6. HW4: exercises 2.10, 2.12, and 2.14 [J07], due on Tuesday, February 11. Correction in class of HW1 and HW2. Continuation of authors' slides for Ch.2 [J07].
February 6 (Thu), 2020 Completion of authors' slides for Ch.2 [J07]. The "Visit to Asia" ("Chest Clinic") Bayesian network. Bucket elimination for belief update.
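A minimal Python sketch of belief update by eliminating variables bucket-by-bucket; the factor representation (binary variables only) and the tiny two-node example are my own simplification, and evidence handling (restricting factor tables before elimination) is omitted:

    # Minimal bucket/variable elimination for belief update, on factors stored as
    # (variables, table) where table maps assignment tuples (0/1 values) to numbers.
    from itertools import product

    def multiply(f, g):
        fv, ft = f; gv, gt = g
        vars_ = list(dict.fromkeys(fv + gv))           # union of scopes, order-preserving
        table = {}
        for assign in product((0, 1), repeat=len(vars_)):
            a = dict(zip(vars_, assign))
            table[assign] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
        return (tuple(vars_), table)

    def sum_out(var, f):
        fv, ft = f
        keep = tuple(v for v in fv if v != var)
        table = {}
        for assign, p in ft.items():
            key = tuple(a for v, a in zip(fv, assign) if v != var)
            table[key] = table.get(key, 0.0) + p
        return (keep, table)

    def eliminate(factors, ordering, query):
        """Sum out every variable in `ordering` except `query`; return P(query) normalized."""
        for var in ordering:
            if var == query:
                continue
            bucket = [f for f in factors if var in f[0]]   # factors mentioning var
            rest = [f for f in factors if var not in f[0]]
            prod = bucket[0]
            for f in bucket[1:]:
                prod = multiply(prod, f)
            factors = rest + [sum_out(var, prod)]
        result = factors[0]
        for f in factors[1:]:
            result = multiply(result, f)
        z = sum(result[1].values())
        return {k: v / z for k, v in result[1].items()}

    # Tiny example: network A -> B; numbers are illustrative.
    fA  = (("A",), {(0,): 0.6, (1,): 0.4})
    fBA = (("B", "A"), {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8})
    print(eliminate([fA, fBA], ["A", "B"], "B"))   # marginal P(B)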
February 11 (Tue), 2020 The secondary optimization problem of Non-serial Dynamic Programming: finding an optimal elimination ordering. Interaction graphs. Min-degree and min-deficiency (a.k.a. min fill-in) heuristics. Bucket elimination for the MPE (Most Probable Explanation) problem.
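A short Python sketch of the two greedy heuristics on an interaction graph (adjacency stored as a dict of neighbor sets); the example graph is illustrative:

    # Greedy elimination orderings on an interaction graph (dict: node -> set of neighbors).

    def fill_in(graph, v):
        """Number of edges that eliminating v would add among its neighbors."""
        nbrs = list(graph[v])
        return sum(1 for i in range(len(nbrs)) for j in range(i + 1, len(nbrs))
                   if nbrs[j] not in graph[nbrs[i]])

    def greedy_ordering(graph, cost):
        """Repeatedly eliminate the node minimizing `cost`, connecting its neighbors."""
        g = {v: set(ns) for v, ns in graph.items()}
        order = []
        while g:
            v = min(g, key=lambda u: cost(g, u))
            for a in g[v]:
                for b in g[v]:
                    if a != b:
                        g[a].add(b)
            for a in g[v]:
                g[a].discard(v)
            del g[v]
            order.append(v)
        return order

    min_degree     = lambda g, v: len(g[v])    # min-degree heuristic
    min_deficiency = fill_in                   # min-fill (min-deficiency) heuristic

    # Illustrative interaction graph.
    graph = {"A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B", "D"}, "D": {"B", "C"}}
    print(greedy_ordering(graph, min_degree))
    print(greedy_ordering(graph, min_deficiency))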
February 13 (Thu), 2020 HW5 assigned: exercise 2.21 [J07], due on Tuesday, 2020-02-18; please use the current version of Hugin on the departmental Linux machine or download an earlier version as explained on Blackboard. HW6 assigned: exercises 2.22 and 2.23 [J07], due on Thursday, 2020-02-20; for exercise 2.22, please use the four-variable version of the earthquake network (Earthquake, Burglary, Alarm, RadioReport). Correction of HW3 and HW4. I-equivalence and the Chernobyl example, revisited.
February 18 (Tue), 2020 Due date of HW6 changed to 2020-02-25. Discussion of Hugin (old version) installation in Windows. Example of use of Hugin in Windows. Example of use of Hugin 8.8 (current version) in Linux using MobaXterm. Use of the Linux version is recommended. Chapter 3 ("Building Models") [J07] started. Infected milk example, with several variations: one-day, seven-day with Markov property, seven-day with a two-day memory of infection, seven-day with a one-day memory of correctness of test.
February 20 (Thu), 2020 HW7: Exercises 3.3, 3.5, 3.6, 3.8, 3.9 [J07], due on Thursday, February 27. (Please use Hugin for all exercises in this assignment, including 3.3.) Chapter 3 ("Building Models") [J07] continued, up to the beginning of 3.2.3 (Poker Game).
February 25 (Tue), 2020 Chapter 3 ("Building Models") [J07] continued, up to 3.3.5 ("Expert Disagreement").
February 27 (Thu), 2020 The midterm exam will be on Thursday, March 5. HW8 assigned: Exercises 3.10 (i only), 3.12 (i and ii only), 3.13 (i-iii only), due on Tuesday, March 10. (Later delayed to 2020-03-17.) (Note: students are encouraged to do the exercises by Thursday, March 5, in preparation for the midterm exam.) Correction of HW5 and HW6. Chapter 3 ("Building Models") [J07] continued: Joint Probabilities, Most-Probable Explanation, Data Conflict (started).
March 3 (Tue), 2020 Correction of HW7. Bayesian networks with continuous variables, with an example: the angina network with a thermometer. The stratum method for constructing Bayesian networks. Discussion of some issues concerning causality and intervention.
March 5 (Thu), 2020 Midterm exam.
March 24 (Tue), 2020 First online class after extended spring break. The course continues with a modified time allocation framework and syllabus that allows asynchronous delivery, with prerecorded videos and reading assignments. Reminder: HW8 (Exercises 3.10 (i only), 3.12 (i and ii only), 3.13 (i-iii only)) is now due on March 31. Class agenda. Correction of MT1. [J96] started.
March 26 (Thu), 2020 [J96] continued up to the statement of Theorem 4.6. Class agenda. Notice about remote lab access to CEC computer labs; Hugin 8.8 is installed on the lab machines.
March 31 (Tue), 2020 Correction of HW8. Section 3.3.7 [J07]: Dynamic Bayesian networks and related repetitive models: Hidden Markov models and Kalman filters. Time slices. [J96] continued through the end of Section 4.4. Class agenda.
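A small Python sketch of forward filtering over time slices in a two-state hidden Markov model, the simplest of the repetitive models mentioned above; all numbers are illustrative:

    # Forward filtering in a two-state HMM (states 0/1); numbers are illustrative.

    prior      = [0.5, 0.5]                       # P(X_1)
    transition = [[0.9, 0.1], [0.2, 0.8]]         # P(X_t | X_{t-1})
    emission   = [[0.8, 0.2], [0.3, 0.7]]         # P(e_t | X_t), two possible observations

    def forward(observations):
        """Return P(X_t | e_1..e_t) for each time slice t."""
        belief = [prior[s] * emission[s][observations[0]] for s in (0, 1)]
        z = sum(belief)
        belief = [b / z for b in belief]
        beliefs = [belief]
        for obs in observations[1:]:
            predicted = [sum(belief[s] * transition[s][s2] for s in (0, 1)) for s2 in (0, 1)]
            updated = [predicted[s2] * emission[s2][obs] for s2 in (0, 1)]
            z = sum(updated)
            belief = [u / z for u in updated]
            beliefs.append(belief)
        return beliefs

    print(forward([0, 0, 1, 1]))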
April 2 (Thu), 2020 [J96] continued through the end of the presentation of the junction tree algorithm (Section 4.5). Reference to Jensen and Jensen's "Optimal Junction Trees" paper in UAI-94, available on the main course website, for the proof of Theorem 4.11. The agenda was edited to better reflect what was covered in class. Class agenda.
April 7 (Tue), 2020 Please check the Time Allocation Framework page for assignments. HW9 is due on 2020-04-09. Update: no penalty for HW9 submitted by 2020-04-14. HW10 is due on 2020-04-16. Brief comments on cutset conditioning (briefly described in Exercise 4.11 [J96]) and recursive conditioning (section 4.7.1 [J07]), as examples of algorithms for exact belief computation with bounded space. Stochastic simulation, presented following Section 4.6 [J96]: probabilistic logic sampling and Gibbs sampling. Brief discussion of likelihood weighting (section 4.8.2 [J07]). Brief discussion of Loopy Belief Propagation (4.9 [J07]). Other exact methods: Factor Trees (link given in agenda). Class agenda.
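As a companion to the sampling discussion, a minimal Python sketch of likelihood weighting on a two-node network A -> B; the network and numbers are illustrative, not from [J07]:

    # Likelihood weighting for P(A | B = 1) in the network A -> B; numbers are illustrative.
    import random

    p_a = 0.3                                   # P(A = 1)
    p_b_given_a = {0: 0.1, 1: 0.9}              # P(B = 1 | A)

    def likelihood_weighting(n_samples, b_evidence=1, seed=0):
        rng = random.Random(seed)
        weights = {0: 0.0, 1: 0.0}
        for _ in range(n_samples):
            a = 1 if rng.random() < p_a else 0          # sample the non-evidence variable A
            w = p_b_given_a[a] if b_evidence == 1 else 1 - p_b_given_a[a]
            weights[a] += w                             # weight by likelihood of the evidence
        total = weights[0] + weights[1]
        return {a: w / total for a, w in weights.items()}

    print(likelihood_weighting(100_000))
    # Exact answer for comparison: P(A=1 | B=1) = 0.9*0.3 / (0.9*0.3 + 0.1*0.7) = 0.794...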
April 9 (Thu), 2020 Learning the Structure of Bayesian Networks: Introduction, Constraint-Based Learning, and the PC Algorithm (Ch.7 [J07]). Applying PC to the Visit to Asia Example, and the problem of functional nodes (such as E), whose presence breaks the faithfulness condition. The agenda was edited to better reflect what was covered in class. Class agenda.
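A rough Python sketch of the skeleton phase of the PC algorithm, written against an independence oracle supplied by the caller (in practice a statistical test); edge orientation is omitted, and the toy oracle at the end encodes the chain X -> Z -> Y rather than the Visit to Asia network:

    # Skeleton phase of the PC algorithm (edge removal only; orientation omitted).
    from itertools import combinations

    def pc_skeleton(variables, independent):
        """`independent(x, y, s)` should return True iff x and y are judged
        conditionally independent given the set s."""
        adj = {v: set(variables) - {v} for v in variables}
        sepsets = {}
        level = 0
        while any(len(adj[x] - {y}) >= level for x in variables for y in adj[x]):
            for x in variables:
                for y in list(adj[x]):
                    # Try all conditioning sets of the current size drawn from adj(x) \ {y}.
                    for s in combinations(sorted(adj[x] - {y}), level):
                        if independent(x, y, set(s)):
                            adj[x].discard(y)
                            adj[y].discard(x)
                            sepsets[frozenset((x, y))] = set(s)
                            break
            level += 1
        return adj, sepsets

    # Toy oracle encoding the chain X -> Z -> Y: X and Y are independent given {Z}.
    def oracle(x, y, s):
        return {x, y} == {"X", "Y"} and "Z" in s

    print(pc_skeleton(["X", "Y", "Z"], oracle))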
April 14 (Tue), 2020 Value of Information: Section 11.1 [J07] with additional material from Section 5.5 [J96], including an example of non-myopic data request. The agenda was edited to better reflect what was covered in class. Class agenda.
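A small Python sketch of the simpler myopic value-of-information computation (the non-myopic case discussed in [J96] is not covered here): compare the expected utility of deciding now with the expected utility of first observing a test T. The hypothesis, decision, and utility numbers are invented for illustration:

    # Myopic value of information: is it worth observing T before choosing a decision d?
    # All numbers (prior, test sensitivities, utilities) are illustrative.

    p_h = {"disease": 0.1, "healthy": 0.9}                       # P(H)
    p_t_given_h = {"disease": {"pos": 0.9, "neg": 0.1},          # P(T | H)
                   "healthy": {"pos": 0.2, "neg": 0.8}}
    utility = {("treat", "disease"): 90, ("treat", "healthy"): 60,
               ("wait",  "disease"): 0,  ("wait",  "healthy"): 100}

    def best_eu(posterior):
        """Expected utility of the best decision under a distribution over H."""
        return max(sum(posterior[h] * utility[(d, h)] for h in posterior)
                   for d in ("treat", "wait"))

    # Expected utility of deciding without the test.
    eu_no_test = best_eu(p_h)

    # Expected utility with the test: average the best decision over test outcomes.
    eu_test = 0.0
    for t in ("pos", "neg"):
        p_t = sum(p_h[h] * p_t_given_h[h][t] for h in p_h)
        posterior = {h: p_h[h] * p_t_given_h[h][t] / p_t for h in p_h}
        eu_test += p_t * best_eu(posterior)

    print("value of information:", eu_test - eu_no_test)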
April 16 (Thu), 2020 HW10 will be accepted without penalty until next Tuesday at class time. Missing data and the EM algorithm (Section 6.2 [J07]). Likelihood Weighting and Gibbs Sampling (Section 4.8 [J07]). Please review carefully item 9 in the attached agenda; we did not cover item 11. Class agenda.
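A small Python sketch of EM for the missing-data setting of Section 6.2, on a two-node network A -> B with some values of A missing; the data set and starting parameters are made up:

    # EM for parameter learning with missing data in the network A -> B (binary).
    # Records are (a, b) pairs; a may be None (missing).  Data and starting values
    # are illustrative only.

    data = [(1, 1), (0, 0), (None, 1), (None, 0), (1, 1), (0, 1), (None, 1)]

    theta_a = 0.5                     # P(A = 1)
    theta_b = {0: 0.5, 1: 0.5}        # P(B = 1 | A = a)

    for _ in range(50):
        # E-step: expected counts, filling in missing A with its posterior given B.
        n_a1 = 0.0
        n_ab = {(0, 0): 0.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.0}
        for a, b in data:
            if a is None:
                p1 = theta_a * (theta_b[1] if b == 1 else 1 - theta_b[1])
                p0 = (1 - theta_a) * (theta_b[0] if b == 1 else 1 - theta_b[0])
                q1 = p1 / (p0 + p1)             # P(A = 1 | B = b) under current parameters
                n_a1 += q1
                n_ab[(1, b)] += q1
                n_ab[(0, b)] += 1 - q1
            else:
                n_a1 += a
                n_ab[(a, b)] += 1
        # M-step: re-estimate the parameters from the expected counts.
        theta_a = n_a1 / len(data)
        theta_b = {a: n_ab[(a, 1)] / (n_ab[(a, 0)] + n_ab[(a, 1)]) for a in (0, 1)}

    print(theta_a, theta_b)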
April 21 (Tue), 2020 Please do the course survey online in Blackboard. Score-based learning of Bayesian network structure. The excision semantics for causality and intervention. The Coins and Bell example revisited. Identifiability of causal effects. Fisher's genotype model of smoking (1958) and unidentifiability of the causal effect of X on Y in this model. Please review carefully item 10 in the attached agenda. Class agenda.
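A short Python sketch of score-based structure comparison using the BIC score (one common scoring function; not necessarily the one used in [J07]) on fully observed binary data; the data and candidate structures are illustrative:

    # BIC score of a candidate DAG over binary variables, from fully observed data.
    # The score trades off fitted log-likelihood against model complexity; higher is better.
    from math import log
    from itertools import product

    def bic_score(dag, data):
        """dag: dict node -> list of parents.  data: list of dicts node -> 0/1."""
        n = len(data)
        score = 0.0
        for node, parents in dag.items():
            for pa_vals in product((0, 1), repeat=len(parents)):
                rows = [r for r in data if all(r[p] == v for p, v in zip(parents, pa_vals))]
                counts = [sum(1 for r in rows if r[node] == x) for x in (0, 1)]
                for c in counts:
                    if c > 0:
                        score += c * log(c / len(rows))     # maximized log-likelihood term
            n_params = 2 ** len(parents)                    # one free parameter per parent configuration
            score -= 0.5 * log(n) * n_params                # complexity penalty
        return score

    # Compare two candidate structures on illustrative data.
    data = [{"A": a, "B": a if i % 4 else 1 - a} for i, a in enumerate([0, 1] * 20)]
    print(bic_score({"A": [], "B": ["A"]}, data))   # A -> B
    print(bic_score({"A": [], "B": []}, data))      # no edge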
April 23 (Thu), 2020 Discussion of take-home final and graduate student presentations. Students are asked to submit course reviews via Blackboard. The review tool is called "My Course Evaluations" and is found in the same area (Tools) as Blackboard Collaborate Ultra. End of course. Class agenda.