CSCE 582 {=STAT 582} (Spring 2023) Lecture Log

January 10 (Tue), 2023 Introduction to the Course: syllabus, objectives, course structure, and topics to be covered. Bayesian networks and decision graphs as a topic area within artificial intelligence. Example of plausible reasoning and causal graphs: icy roads.

January 12 (Thu), 2023 Examples of plausible reasoning and causal graphs: icy roads, wet grass, earthquake, car start. Networks drawn using Hugin; numbers entered for the wet grass example. Reasoning in the evidential and causal direction. Intercausal reasoning and explaining away. Causal networks and d-separation through parts of Section 2.2 [J].
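
As an aside, explaining away is easy to see by brute-force enumeration over a three-node wet-grass network (Rain and Sprinkler both cause WetGrass). A minimal sketch in Python; the CPT numbers below are made up for illustration and are not the ones entered in Hugin in class.

```python
from itertools import product

# P(Rain=1), P(Sprinkler=1), and P(WetGrass=1 | Rain, Sprinkler); made-up numbers.
p_r = {1: 0.2, 0: 0.8}
p_s = {1: 0.1, 0: 0.9}
p_w1 = {(1, 1): 0.99, (1, 0): 0.90, (0, 1): 0.80, (0, 0): 0.0}

def joint(r, s, w):
    """Full joint P(R=r, S=s, W=w), factored as P(R) P(S) P(W | R, S)."""
    pw = p_w1[(r, s)]
    return p_r[r] * p_s[s] * (pw if w else 1.0 - pw)

def p_rain_given(w=None, s=None):
    """P(Rain=1 | evidence) by summing the full joint table."""
    def keep(sv, wv):
        return (w is None or wv == w) and (s is None or sv == s)
    num = sum(joint(1, sv, wv) for sv, wv in product((0, 1), repeat=2) if keep(sv, wv))
    den = sum(joint(rv, sv, wv) for rv, sv, wv in product((0, 1), repeat=3) if keep(sv, wv))
    return num / den

print(p_rain_given())          # prior:                0.20
print(p_rain_given(w=1))       # evidential reasoning: ~0.74
print(p_rain_given(w=1, s=1))  # explaining away:      ~0.24
```

Observing wet grass raises the probability of rain; additionally observing that the sprinkler was on drops it back down, which is exactly the intercausal pattern above.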

January 17 (Tue), 2023 Computation of certainty factors in a compositional system. Problems with compositional system. The Chernobyl example with Naive Bayes and five-variable causal networks; the importance of structure in a causal model. A first look at Hugin. The (Kolmogorov) axioms of probability (with the "definition" of conditional probability as a fourth axiom). Proof that the classical model of probability satisfies the first three axioms.
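
For reference, the axioms as stated in class, with the "definition" of conditional probability as the fourth:

```latex
\begin{align*}
&(1)\;\; P(A) \ge 0
&&(2)\;\; P(\Omega) = 1\\
&(3)\;\; P(A \cup B) = P(A) + P(B) \;\text{ for disjoint } A, B
&&(4)\;\; P(A \mid B) = \frac{P(A \cap B)}{P(B)} \;\text{ when } P(B) > 0
\end{align*}
```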

January 19 (Thu), 2023 HW1 will consist of exercises 1.11, 1.12, and 1.13 (due date not set). Proof that the classical and subjective interpretations of probability are models of the axioms. Conditional probability as a fourth axiom of probability. The fundamental rule and Bayes' rule. Probability prerequisites; potentials (Ch.1 [J]).
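
In the notation of Ch.1 [J], the fundamental rule and Bayes' rule read:

```latex
P(A \cap B) = P(A \mid B)\,P(B),
\qquad
P(B \mid A) = \frac{P(A \mid B)\,P(B)}{P(A)}.
```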

January 24 (Tue), 2023 HW1 assigned: exercises 1.11, 1.12, and 1.13 (due on 2023-01-31). An algebra of potentials. Ch.2 [J] through Section 2.3.1. Examples of computation of d-separation. Definition of causal network. The clarity (or clairvoyance) principle. Conditional probability tables for Bayesian networks. "Constructive" definition of Bayesian networks.
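
A potential is just a nonnegative table over the joint domain of a set of variables, and the algebra is pointwise. A minimal sketch (binary variables only; the pair representation and function names are mine, not from [J]):

```python
from itertools import product

# A potential is (vars, table): vars is a tuple of names, table maps a full
# assignment tuple (aligned with vars) to a nonnegative number.
def multiply(pot1, pot2):
    """Pointwise product; the domain is the union of the two variable sets."""
    (vars1, t1), (vars2, t2) = pot1, pot2
    out_vars = vars1 + tuple(v for v in vars2 if v not in vars1)
    table = {}
    for assign in product((0, 1), repeat=len(out_vars)):  # binary domains
        env = dict(zip(out_vars, assign))
        table[assign] = t1[tuple(env[v] for v in vars1)] * t2[tuple(env[v] for v in vars2)]
    return (out_vars, table)

def marginalize(pot, var):
    """Sum a variable out of a potential (projection onto the rest)."""
    vars_, t = pot
    i = vars_.index(var)
    table = {}
    for assign, val in t.items():
        key = assign[:i] + assign[i + 1:]
        table[key] = table.get(key, 0.0) + val
    return (vars_[:i] + vars_[i + 1:], table)
```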

January 26 (Thu), 2023 HW2 assigned: exercises 2.1 -- 2.6 [J], due on 2023-02-02. Note that the due date for HW1 in the previous entry was incorrect. Evidence, findings, probability of evidence. Bayes' rule with probability potentials: zeroing out and normalizing. Variable elimination. Munin. Probabilistic graphical models and their advantages. Ch.2 [J] slides from the authors completed.
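
Continuing the toy potential representation sketched above, entering a finding is exactly the "zero out, then normalize" step, and the normalization constant that falls out is the probability of the evidence. A possible sketch:

```python
def enter_finding(pot, var, value):
    """Zero out all table entries inconsistent with the finding var = value."""
    vars_, t = pot
    i = vars_.index(var)
    return (vars_, {a: (v if a[i] == value else 0.0) for a, v in t.items()})

def normalize(pot):
    """Scale the table to sum to one; the constant z is P(evidence)."""
    vars_, t = pot
    z = sum(t.values())
    return (vars_, {a: v / z for a, v in t.items()}), z
```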

January 31 (Tue), 2023 Definition of Bayesian networks, according to [Neapolitan, 1990]. The local Markov condition. Theorem 3.7 in [Lauritzen, 1996]: equivalence of the (directed) recursive factorization, global Markov property, and local Markov property. Proof of the chain rule for Bayesian networks (i.e., recursive factorization) from Neapolitan's definition (i.e., the global Markov property). The visit to Asia example.
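
The chain rule for Bayesian networks (recursive factorization), in the notation of [J]:

```latex
P(x_1, \ldots, x_n) \;=\; \prod_{i=1}^{n} P\bigl(x_i \mid \mathrm{pa}(x_i)\bigr),
```

where pa(x_i) denotes the parents of x_i in the DAG.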

February 2 (Thu), 2023 Example of construction of Bayesian network structure from qualitative considerations: Cooper's papilledema network as presented in [Neapolitan, 1990]. A successful Bayesian network: BOBLO (BOvine BLOod Typing). Detailed example of bucket elimination for solving the belief update problem, using visit to Asia.

February 7 (Tue), 2023 HW3 assigned: exercises 2.7-2.10, 2.12, and 2.14 [J], due 2023-02-16. MPE defined. Example of solving MPE using bucket elimination. MAP. Axioms for local computation. Elimination ordering: interaction (domain) graphs; minimum deficiency and minimum degree heuristics. Moral graphs. Lauritzen's algorithm for d-separation. Markov blankets.
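
A possible sketch of the minimum-degree heuristic on the interaction (domain) graph, with fill-in edges added as each variable is eliminated (representation mine; minimum deficiency is analogous, picking the node whose elimination adds the fewest fill-ins):

```python
def min_degree_order(graph):
    """Greedy minimum-degree elimination ordering.

    graph: dict mapping each node to its set of neighbours (e.g. a moral graph).
    Eliminating a node connects its remaining neighbours into a clique (fill-in).
    """
    g = {v: set(nb) for v, nb in graph.items()}
    order = []
    while g:
        v = min(g, key=lambda u: len(g[u]))  # node of smallest current degree
        nbs = g.pop(v)
        for a in nbs:                        # remove v and add fill-in edges
            g[a].discard(v)
            g[a] |= (nbs - {a})
        order.append(v)
    return order

print(min_degree_order({"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}))
# -> ['A', 'B', 'C', 'D']
```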

February 9 (Thu), 2023 Correction of HW2 in class, with discussion. Ch.3 [J] ("Building Models") started with Section 3.1 ("Catching the Structure"): the infected milk example in its one-day variant, the seven-day variant with the Markov assumption, and the seven-day variant with persistent infection and persistent test errors.

February 14 (Tue), 2023 HW4 assigned: exercises 2.17, 2.18, 2.19 (Note errata: DAG(a) should be B<-A->C.), 2.20 (install Hugin or show evidence of use on a departmental Linux machine), and 2.23 [J], due 2023-02-23. (Original due date was 02-21.) "Catching the Structure" (Section 3.1 [J]) completed. The stratum method.

February 16 (Thu), 2023 Equivalence classes of Bayesian networks. Bayesian networks in the same equivalence class are indistinguishable from (observational) data, but may encode different causal assumptions. Faithfulness. Mimicking. Where do the numbers come from? Section 3.3 [J] started.
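
Indistinguishability has a crisp graphical test (due to Verma and Pearl): two DAGs are in the same equivalence class iff they have the same skeleton and the same v-structures. A small sketch, with a DAG given as a child-to-parents map (representation mine):

```python
def skeleton(dag):
    """Undirected edge set of a DAG given as {child: set of parents}."""
    return {frozenset((p, c)) for c, ps in dag.items() for p in ps}

def v_structures(dag):
    """Triples (a, c, b) with a -> c <- b and a, b non-adjacent."""
    skel = skeleton(dag)
    return {(a, c, b) for c, ps in dag.items()
            for a in ps for b in ps
            if a < b and frozenset((a, b)) not in skel}

def markov_equivalent(d1, d2):
    return skeleton(d1) == skeleton(d2) and v_structures(d1) == v_structures(d2)

chain   = {"A": set(), "B": {"A"}, "C": {"B"}}       # A -> B -> C
rev     = {"C": set(), "B": {"C"}, "A": {"B"}}       # A <- B <- C
collide = {"A": set(), "C": set(), "B": {"A", "C"}}  # A -> B <- C
print(markov_equivalent(chain, rev))      # True: same class, different causal story
print(markov_equivalent(chain, collide))  # False: the collider is distinguishable
```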

February 21 (Tue), 2023 HW5 assigned: exercises 3.3, 3.5, 3.6, 3.8, and 3.9 [J], due 2023-02-28. You must use Hugin for all of these exercises. The version of Hugin linked on Blackboard no longer works; students need to use the version of Hugin on the CSE Linux machines or on the CEC Windows machines. Notice about remote lab access to the CEC Windows computer labs; Hugin 9.1 is installed on the lab machines. Section 3.3 [J] completed.

February 23 (Thu), 2023 Midterm exam date confirmed: it will be on 2023-03-02, as indicated in the syllabus. Please note that you need to use Hugin for *all* exercises in HW5, including 3.3. Hybrid Bayesian networks with continuous (Gaussian and conditional Gaussian) variables, using the "cold or angina" example. Sum-propagate (the Hugin realization of belief update, aka belief assessment or computation of posterior probabilities) vs. max-propagate (the Hugin realization of MPE computation), using the "transmission of symbol strings" example.

February 28 (Tue), 2023 Q&A on the midterm, which will be on March 2 (next class). A brief eulogy for Martin Davis (March 8, 1928 to January 1, 2023). The DP algorithm for satisfiability, directional resolution, and its relation to bucket elimination. Non-serial dynamic programming. Solving constraint satisfaction problems by variable elimination using joins and projections.
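
A minimal sketch of the join and projection operations used in variable elimination for constraint satisfaction (a constraint as a set of allowed tuples; representation mine):

```python
def join(c1, c2):
    """Natural join of two constraints (vars, set of allowed tuples)."""
    (v1, t1), (v2, t2) = c1, c2
    out = v1 + tuple(x for x in v2 if x not in v1)
    rows = set()
    for r1 in t1:
        e = dict(zip(v1, r1))
        for r2 in t2:
            if all(e.get(x, val) == val for x, val in zip(v2, r2)):
                e2 = {**e, **dict(zip(v2, r2))}
                rows.add(tuple(e2[x] for x in out))
    return (out, rows)

def project_out(c, var):
    """Drop a variable from a constraint (existential projection)."""
    vs, rows = c
    i = vs.index(var)
    return (vs[:i] + vs[i + 1:], {r[:i] + r[i + 1:] for r in rows})

# Eliminate Y from "X != Y" and "Y != Z" over domain {0, 1}:
c1 = (("X", "Y"), {(0, 1), (1, 0)})
c2 = (("Y", "Z"), {(0, 1), (1, 0)})
print(project_out(join(c1, c2), "Y"))  # leaves X == Z, as expected
```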

March 2 (Thu), 2023 Midterm Exam.

March 14 (Tue), 2023 Discussion of midterm. Much Q&A. Some topics from Ch.3 [J], especially: conflicts and sensitivity to parameter (conditional probability) values.
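
For findings e1, ..., em with combined evidence e, the conflict measure takes the form

```latex
\mathrm{conf}(e) \;=\; \log_2 \frac{P(e_1)\,P(e_2)\cdots P(e_m)}{P(e)},
```

so a positive value means the findings are less likely jointly than independence would suggest, flagging a possible conflict (or a rare case).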

March 16 (Thu), 2023 Dynamic Bayesian networks, repetitive temporal structures, hidden Markov models, Kalman filters. Object-oriented Bayesian networks (OOBNs).
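
A hidden Markov model is the simplest repetitive temporal structure: one state variable and one observation per time slice. A minimal filtering (forward) pass, assuming NumPy and the made-up shapes documented below:

```python
import numpy as np

def forward(init, trans, emit, obs_seq):
    """Filtered state beliefs P(X_t | o_1..t) for an HMM.

    init:  (S,)   initial state distribution
    trans: (S, S) trans[i, j] = P(X_{t+1} = j | X_t = i)
    emit:  (S, O) emit[i, o]  = P(O_t = o | X_t = i)
    """
    alpha = init * emit[:, obs_seq[0]]
    filtered = [alpha / alpha.sum()]
    for o in obs_seq[1:]:
        alpha = (filtered[-1] @ trans) * emit[:, o]  # predict, then condition
        filtered.append(alpha / alpha.sum())
    return filtered

# Two hidden states, two observation symbols; illustrative numbers.
init  = np.array([0.7, 0.3])
trans = np.array([[0.8, 0.2], [0.3, 0.7]])
emit  = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward(init, trans, emit, [1, 1, 0]))
```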

March 21 (Tue), 2023 Graphical Languages for Specification of Decision Problems (ch.9 [J]) through Section 9.2.

March 23 (Thu), 2023 HW6 assigned: Exercises 3.10, 3.12, 3.13 [J] due 2023-03-30 (updated to 2023-04-04). Please see the errata (direct link at the beginning of the "Lecture Notes" page on the main course website) for the correct table to be used for exercise 3.13. Ch.9 [J] continued: decision trees.
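
Decision trees are solved by "average out and roll back": expectation at chance nodes, maximization at decision nodes. A minimal sketch with a made-up umbrella example (numbers illustrative, not from [J]):

```python
def rollback(tree):
    """Expected utility of a decision tree by average-out-and-roll-back.

    A tree is a terminal utility (a number),
    ("chance",   [(prob, subtree), ...]), or
    ("decision", {option: subtree, ...}).
    """
    if isinstance(tree, (int, float)):
        return tree
    kind, branches = tree
    if kind == "chance":
        return sum(p * rollback(sub) for p, sub in branches)
    return max(rollback(sub) for sub in branches.values())  # decision node

tree = ("decision", {
    "umbrella":    ("chance", [(0.3, 60), (0.7, 80)]),   # rain / no rain
    "no umbrella": ("chance", [(0.3, 0),  (0.7, 100)]),
})
print(rollback(tree))  # 74 vs. 70: taking the umbrella maximizes expected utility
```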

March 28 (Tue), 2023 Ch.9 [J] continued: influence diagrams. The slides accompanying ch.9 [J] were completed. However, they cover perfect recall influence diagrams only, and the current version of Hugin uses LIMIDs, so it is likely that we will do more on IDs.

March 30 (Thu), 2023 HW6 due date changed to 2023-04-04. HW7 assigned: Exercise 9.8 [J], due 2023-04-06; this is a paper-and-pencil exercise. Ch.4 [J96] ("Propagation in Bayesian Networks") handed out in class. We go over the presentation in the handout up to the statement of Theorem 4.4.

April 4 (Tue), 2023 HW6 due date changed to 2023-04-06. Discussion of exercises 3.10 and 3.13. Ch.4 [J96] ("Propagation in Bayesian Networks") continued; we go over the presentation in the handout up to the statement of Theorem 4.5.

April 6 (Thu), 2023 Ch.4 [J96] ("Propagation in Bayesian Networks") continued; we conclude Section 4.5 on Junction Trees. (Some proofs in the appendix may be done next time.)

April 11 (Tue), 2023 Presentation of the junction tree algorithms from Ch.4 [J96] completed with examples. (Some proofs in the appendix may be done next time.)
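
The core step in these algorithms is projecting a clique potential onto a separator and absorbing the result into the neighboring clique. A single-message sketch over binary variables (clique potentials phi1(A, S) and phi2(S, B) with separator S; all numbers illustrative):

```python
import numpy as np

phi1 = np.array([[0.3, 0.7],    # phi1[a, s] = P(S = s | A = a)
                 [0.6, 0.4]])
phi2 = np.array([[0.5, 0.5],    # phi2[s, b] = P(B = b | S = s)
                 [0.9, 0.1]])
prior_a = np.array([0.4, 0.6])
phi1 = phi1 * prior_a[:, None]  # absorb P(A) into clique 1

msg  = phi1.sum(axis=0)         # project clique 1 onto the separator S
phi2 = phi2 * msg[:, None]      # absorb the message into clique 2
p_b  = phi2.sum(axis=0)         # marginal over B
print(p_b / p_b.sum())          # P(B) after one collect pass: [0.708, 0.292]
```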

April 13 (Thu), 2023 The final exam will be at 1600 on April 27, 2023, in the classroom. This is the time given in the university final exam schedule for Spring 2023. The syllabus on the main course website has the wrong date. Approximate belief propagation (Section 4.8 [J]); loopy belief propagation (Section 4.9 [J]). Bayesian networks as classifiers: an example using Hugin, to be completed.

April 18 (Tue), 2023 Bayesian networks as classifiers: an example using Hugin, completed. Parameter estimation and the EM algorithm (Ch.6 [J], especially 6.1.1 and 6.2).
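
A minimal EM sketch for the CPTs of a two-node network A -> B (binary) when some cases are missing the value of A. The data and starting parameters are made up; this follows the spirit of Section 6.2 [J], not its exact presentation:

```python
import numpy as np

cases = [(0, 0), (0, 1), (1, 1), (None, 1), (None, 0), (1, 0), (None, 1)]

p_a = np.array([0.5, 0.5])   # P(A)
p_b = np.full((2, 2), 0.5)   # p_b[a, b] = P(B = b | A = a)

for _ in range(50):
    # E-step: expected counts, spreading each missing A over its posterior.
    n_a, n_ab = np.zeros(2), np.zeros((2, 2))
    for a, b in cases:
        post = np.eye(2)[a] if a is not None else p_a * p_b[:, b] / (p_a * p_b[:, b]).sum()
        n_a += post
        n_ab[:, b] += post
    # M-step: re-estimate the parameters from the expected counts.
    p_a = n_a / n_a.sum()
    p_b = n_ab / n_a[:, None]

print(p_a)
print(p_b)
```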

April 20 (Thu), 2023 Final exam will be comprehensive and open book. (Note: textbook only; print it if necessary.) Learning structure: conceptual introduction to score-based and constraint-based learning; the PC algorithm (Ch.7 [J], especially 7.1). End of course.
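
A sketch of the skeleton phase of the PC algorithm, parameterized by a conditional-independence oracle (in practice a statistical test on data; names and representation are mine):

```python
from itertools import combinations

def pc_skeleton(variables, indep):
    """Remove the edge x - y as soon as some set S of current neighbours
    of x or of y (of growing size k) renders x and y independent.
    indep(x, y, S) is the assumed conditional-independence oracle."""
    adj = {v: set(variables) - {v} for v in variables}
    sepset, k = {}, 0
    while any(len(adj[x] - {y}) >= k for x in variables for y in adj[x]):
        for x, y in combinations(variables, 2):
            if y not in adj[x]:
                continue
            cands = [S for base in (adj[x] - {y}, adj[y] - {x})
                     for S in combinations(sorted(base), k)]
            for S in cands:
                if indep(x, y, set(S)):
                    adj[x].discard(y); adj[y].discard(x)
                    sepset[(x, y)] = set(S)
                    break
        k += 1
    return adj, sepset
```

The separating sets recorded in sepset are what the second phase of PC uses to orient v-structures.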