CSCE 582 {=STAT 582} (Spring 2022) Lecture Log

January 11 (Tue), 2022 Introduction to the Course: syllabus, objectives, course structure, and topics to be covered. Bayesian networks and decision graphs as a topic area within artificial intelligence. Example of plausible reasoning and causal graphs: icy roads.

January 13 (Thu), 2022 Examples of plausible reasoning and causal graphs: icy roads, wet grass, earthquake. Reasoning in the evidential and causal direction. Intercausal reasoning and explaining away. Causal networks and d-separation through most of Section 2.2 [J].

January 18 (Tue), 2022 Computation of certainty factors in a compositional system. Problems with compositional systems. The Chernobyl example with Naive Bayes and five-variable causal networks; the importance of structure in a causal model. A first look at Hugin. Classical and subjective models of probability: Ch.1 [J] started.

January 20 (Thu), 2022 HW1 assigned: exercises 1.11, 1.12, and 1.13, due 2022-02-01 (note date change). The (Kolmogorov) axioms of probability. Proof that the classical and subjective interpretations of probability are models of the axioms. Conditional probability as a fourth axiom of probability. The fundamental rule and Bayes' rule.
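
For reference, the fundamental rule and Bayes' rule in their standard form (generic notation, which may differ slightly from [J]):

```latex
% Fundamental rule: P(A, B) = P(A | B) P(B).
% Bayes' rule follows by writing the joint in both orders:
P(A \mid B)\,P(B) \;=\; P(A, B) \;=\; P(B \mid A)\,P(A)
\quad\Longrightarrow\quad
P(A \mid B) \;=\; \frac{P(B \mid A)\,P(A)}{P(B)}, \qquad P(B) > 0.
```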

January 25 (Tue), 2022 HW1 due date changed (see previous item). Probability prerequisites; potentials; an algebra of potentials (Ch.1 [J]).

January 27 (Thu), 2022 Definition of Bayesian networks, according to [Neapolitan, 1990]. The local Markov condition. Proof of the chain rule for Bayesian networks from Neapolitan's definition. The visit to Asia example.
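
For reference, the chain rule for Bayesian networks in its standard form (this matches Theorem 2.1 [J] up to notation); for the visit-to-Asia network it gives the joint distribution as the product of one conditional table per variable:

```latex
% Chain rule for a Bayesian network over X_1, ..., X_n with parent sets pa(X_i):
P(X_1, \ldots, X_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{pa}(X_i)\bigr).
```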

February 1 (Tue), 2022 Three computational problems. Detailed example of bucket elimination for solving the belief update problem, almost completed.

February 3 (Thu), 2022 HW2: Do exercises 2.1 through 2.11 [J], due on Tuesday, 2022-02-15. Detailed example of bucket elimination for solving the belief update problem, completed. Review of some properties of the algebra of potentials. Review of d-separation. Review of the ",e" notation.
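
A minimal sketch of bucket elimination for belief update, on a made-up three-variable chain; the network, numbers, and helper names below are illustrative, not the example worked in class:

```python
# Bucket (variable) elimination on the chain A -> B -> C: compute P(C)
# by eliminating A, then B. All variables are binary here.
from itertools import product

def multiply(f, g):
    """Pointwise product of two potentials given as (variables, table) pairs."""
    vars_f, tab_f = f
    vars_g, tab_g = g
    vars_out = vars_f + [v for v in vars_g if v not in vars_f]
    tab_out = {}
    for assignment in product([0, 1], repeat=len(vars_out)):
        a = dict(zip(vars_out, assignment))
        tab_out[assignment] = (tab_f[tuple(a[v] for v in vars_f)] *
                               tab_g[tuple(a[v] for v in vars_g)])
    return vars_out, tab_out

def marginalize(f, var):
    """Sum a variable out of a potential."""
    vars_f, tab_f = f
    i = vars_f.index(var)
    vars_out = vars_f[:i] + vars_f[i + 1:]
    tab_out = {}
    for assignment, value in tab_f.items():
        key = assignment[:i] + assignment[i + 1:]
        tab_out[key] = tab_out.get(key, 0.0) + value
    return vars_out, tab_out

# P(A), P(B|A), P(C|B) as potentials over binary variables (made-up numbers).
pA  = (['A'],      {(0,): 0.6, (1,): 0.4})
pBA = (['B', 'A'], {(0, 0): 0.7, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.8})
pCB = (['C', 'B'], {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.9})

# Bucket for A: multiply the potentials mentioning A, then sum A out.
bucket_A = marginalize(multiply(pA, pBA), 'A')   # a potential over B only
# Bucket for B: combine with P(C|B) and sum B out, leaving P(C).
pC = marginalize(multiply(bucket_A, pCB), 'B')
print(pC[1])   # marginal of C; with these numbers {(0,): 0.5, (1,): 0.5}
```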

February 8 (Tue), 2022 HW2: Do exercises 2.1 through 2.11 [J], due on Tuesday, 2022-02-15. (Note new due date.) Variable elimination. Munin. Probabilistic graphical models and their advantages. Ch.2 [J] slides from the authors completed. Detailed example of MPE (Most Probable Explanation) computation.
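
A tiny illustration of how MPE differs from belief update: the summations of variable elimination are replaced by maximizations over joint configurations (made-up numbers, not the example from class):

```python
# Belief update vs. MPE on the two-variable network A -> B (binary variables).
from itertools import product

pA = {0: 0.6, 1: 0.4}
pBA = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # P(B=b | A=a), keyed (b, a)

# Belief update: P(B=b) = sum_a P(a) P(b|a).
marg_B = {b: sum(pA[a] * pBA[(b, a)] for a in (0, 1)) for b in (0, 1)}

# MPE: the joint configuration (a, b) with the highest probability.
mpe = max(product((0, 1), repeat=2),
          key=lambda ab: pA[ab[0]] * pBA[(ab[1], ab[0])])
print(marg_B, mpe)   # here {0: 0.5, 1: 0.5} and the configuration (0, 0)
```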

February 10 (Thu), 2022 Definition of Markov blanket. Lauritzen's algorithm for d-separation: ancestral sets, moralization, the global Markov condition. Equivalence of factorization, the global Markov condition, and the local Markov condition: theorem 3.7 in Lauritzen's _Graphical Models_. Theorem 2.1 [J] (the chain rule for Bayesian networks), sketched.
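
A minimal sketch of the d-separation test via ancestral sets and moralization, assuming the DAG is given as a parent map; the collider example at the end is illustrative, not from [J]:

```python
# d-separation test: restrict to the ancestral set of X ∪ Y ∪ Z, moralize,
# and check whether Z separates X from Y in the resulting undirected graph.
from collections import deque

def ancestors(dag, nodes):
    """The given nodes together with all their ancestors in the DAG."""
    result, stack = set(nodes), list(nodes)
    while stack:
        n = stack.pop()
        for p in dag.get(n, ()):          # dag maps node -> set of parents
            if p not in result:
                result.add(p)
                stack.append(p)
    return result

def moralize(dag, keep):
    """Undirected moral graph of the sub-DAG induced by `keep`."""
    adj = {n: set() for n in keep}
    for child in keep:
        parents = [p for p in dag.get(child, ()) if p in keep]
        for p in parents:                 # drop directions
            adj[child].add(p); adj[p].add(child)
        for i, p in enumerate(parents):   # marry the parents
            for q in parents[i + 1:]:
                adj[p].add(q); adj[q].add(p)
    return adj

def d_separated(dag, xs, ys, zs):
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))
    adj = moralize(dag, keep)
    # BFS from xs in the moral graph with zs removed; separated iff no y is reached.
    blocked, seen = set(zs), set(xs) - set(zs)
    queue = deque(seen)
    while queue:
        n = queue.popleft()
        for m in adj[n]:
            if m not in seen and m not in blocked:
                seen.add(m); queue.append(m)
    return not (seen & set(ys))

# Collider A -> C <- B: A and B are d-separated given {} but not given {C}.
dag = {'A': set(), 'B': set(), 'C': {'A', 'B'}}
print(d_separated(dag, {'A'}, {'B'}, set()))   # True
print(d_separated(dag, {'A'}, {'B'}, {'C'}))   # False
```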

February 15 (Tue), 2022 Valuation-based systems as described in Section 5.5 [J]. NSDP (nonserial dynamic programming), using instructor's notes.

February 17 (Thu), 2022 HW3: Exercises 2.12-2.14 and 2.17-2.19 [J], due 2022-03-01. More on NSDP. Complexity aspects of elimination orderings. Other computational problems that can be solved by variable elimination: constraint satisfaction with relations (using join and projection), and propositional satisfiability using the Davis-Putnam algorithm.
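
A small sketch of the constraint-satisfaction analogue, where relational join and projection play the roles of multiplication and marginalization; the relations below are made up for illustration:

```python
# "Variable elimination" for constraints: eliminate a variable by joining all
# constraints that mention it and then projecting it away.

def join(r, s):
    """Natural join of two relations given as (variables, set-of-tuples)."""
    vars_r, tuples_r = r
    vars_s, tuples_s = s
    shared = [v for v in vars_r if v in vars_s]
    vars_out = vars_r + [v for v in vars_s if v not in vars_r]
    out = set()
    for tr in tuples_r:
        ar = dict(zip(vars_r, tr))
        for ts in tuples_s:
            bs = dict(zip(vars_s, ts))
            if all(ar[v] == bs[v] for v in shared):
                merged = {**ar, **bs}
                out.add(tuple(merged[v] for v in vars_out))
    return vars_out, out

def project_out(r, var):
    """Project a variable away (the relational analogue of summing it out)."""
    vars_r, tuples_r = r
    i = vars_r.index(var)
    return vars_r[:i] + vars_r[i + 1:], {t[:i] + t[i + 1:] for t in tuples_r}

# Constraints A != B and B != C over the domain {0, 1, 2}.
dom = (0, 1, 2)
neq_AB = (['A', 'B'], {(a, b) for a in dom for b in dom if a != b})
neq_BC = (['B', 'C'], {(b, c) for b in dom for c in dom if b != c})

# Eliminate B: join the constraints that mention B, then project B away.
allowed_AC = project_out(join(neq_AB, neq_BC), 'B')
print(allowed_AC)   # the (A, C) pairs extendable to a full solution; here all 9 pairs
```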

February 22 (Tue), 2022 Discussion of some HW2 exercises, especially 2.8, 2.9, and 2.10 [J]. Ch.3 [J] started. Hypothesis and information variables. Confusion matrices (true positives, false positives, true negatives, false negatives). Derived measures (as defined in Jiri Vomlel's "Probabilistic reasoning with uncertain evidence," linked on the course website).
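
As a quick reference, the standard measures derived from a confusion matrix (the exact set discussed in Vomlel's note may differ):

```python
# Common derived measures from the counts of a confusion matrix.
def derived_measures(tp, fp, tn, fn):
    return {
        "sensitivity (recall, TPR)": tp / (tp + fn),
        "specificity (TNR)":         tn / (tn + fp),
        "precision (PPV)":           tp / (tp + fp),
        "accuracy":                  (tp + tn) / (tp + fp + tn + fn),
    }

print(derived_measures(tp=40, fp=10, tn=45, fn=5))
```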

February 24 (Thu), 2022 "Catching the Structure" (Section 3.1 [J]) completed.

March 1 (Tue), 2022 The stratum method. Determining the conditional probabilities (Section 3.2 [J]) started: Section 3.2.1 [J].

March 3 (Thu), 2022 Midterm will be on Thursday, March 17, 2022. HW4 assigned: exercises 2.21, 2.22, 2.23 [J], due on March 15, 2022. For exercise 2.22, please use only four variables: Earthquake?, Burglary?, Alarm?, RadioReport?.

March 15 (Tue), 2022 First meeting after spring break. Midterm date confirmed. Some discussion of HW4 exercises, using Hugin. Computation of joint probability using the "joint configuration" analysis tool of Hugin. Computation of the joint probability of two variables by adding a child variable whose state space is the cross-product of the state spaces of the two parent variables and whose CPT has only zeros and ones. Building models: the stud farm example; the simple poker game example; the faulty transmission channel example.
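
A sketch of the cross-product-child trick described above, with made-up variable names and state spaces:

```python
# To read off P(A, B) in a tool that only reports single-variable marginals,
# add a child J whose states are the pairs (a, b) and whose CPT is
# deterministic (only zeros and ones).
states_A = ['a0', 'a1']
states_B = ['b0', 'b1', 'b2']
states_J = [(a, b) for a in states_A for b in states_B]

# CPT: P(J = (a, b) | A = a', B = b') = 1 if (a, b) == (a', b'), else 0.
cpt_J = {(j, a, b): 1.0 if j == (a, b) else 0.0
         for j in states_J for a in states_A for b in states_B}

# Sanity check: each column of the CPT sums to 1.
assert all(abs(sum(cpt_J[(j, a, b)] for j in states_J) - 1.0) < 1e-9
           for a in states_A for b in states_B)

# With this CPT, the marginal the inference engine reports for J equals the
# joint: P(J=(a,b)) = sum_{a',b'} P(J=(a,b) | a',b') P(a',b') = P(A=a, B=b).
```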

March 17 (Thu), 2022 Midterm Exam.

March 22 (Tue), 2022 HW5 assigned: exercises 3.3, 3.5, 3.6, 3.8, and 3.9 [J], due 2022-03-31. Note: 3.8 was added on 2022-03-29; it had been left out by mistake in the original assignment. Please use Hugin for all exercises in HW5. Discussion of exercise 3.3. Correction of midterm with some extensions: Athenian taxis example, multiple witnesses, collusion among witnesses. Mimicking.

March 24 (Thu), 2022 Discussion of HW4. Ch.3 [J] through Section 3.3.4.

March 29 (Tue), 2022 Reminder: HW5 assigned: exercises 3.3, 3.5, 3.6, 3.8, and 3.9 [J], due 2022-03-31. Note: 3.8 was added on 2022-03-29; it had been left out by mistake in the original assignment. Ch.3 [J] completed. OOBNs not covered.

March 31 (Thu), 2022 HW6 assigned: exercises 3.10, 3.12, 3.13 [J], due on 2021-03-18 (Thursday). Dynamic BNs (Section 3.3.7 [J]). OOBNs (Section 3.3.6 [J]), with an example using Hugin. Batch evaluation of classifiers and classifier performance, using Hugin (started).

April 5 (Tue), 2022 Batch evaluation of classifiers and classifier performance, using Hugin (completed). Probability propagation: the junction tree algorithm, following [J95] (started).

April 7 (Thu), 2022 Discussion of HW6. Probability propagation: the junction tree algorithm, following [J95] (continued).

April 12 (Tue), 2022 More information on the work for graduate students is available in dropbox. The choice of extra work needs to be submitted on dropbox by Friday, April 15, at midnight. HW7 assigned: exercises 3.16 (use Hugin!), 3.21 (second part only; you must estimate the probabilities for the tables using the Noisy-Or assumption), 3.7 (imagine a patient for the "self diagnosis"), 3.29 (for all parts after (i), use the new potentials). Further discussion of HW6. The tables for exercise 3.13 (ii) seem to be such that use of MPE (Max-propagation) vs. Sum-propagation does not change the result. Probability propagation: the junction tree algorithm, following [J95], completed; a summarizing example using the Chest Clinic example will be done next time.
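
A small sketch of filling in a CPT under the Noisy-Or assumption, as required for exercise 3.21; the link probabilities and the optional leak below are made up:

```python
# Noisy-OR CPT for a binary effect Y with binary parents: each active cause
# fails to produce Y independently, so
# P(Y = yes | config) = 1 - (1 - leak) * prod over active parents of (1 - p_i).
from itertools import product

def noisy_or_cpt(link_probs, leak=0.0):
    """link_probs[i] = probability that parent i alone causes Y; leak = P(Y) with no parent active."""
    n = len(link_probs)
    cpt = {}
    for config in product((0, 1), repeat=n):
        p_no = 1.0 - leak
        for active, p in zip(config, link_probs):
            if active:
                p_no *= (1.0 - p)
        cpt[config] = 1.0 - p_no
    return cpt

print(noisy_or_cpt([0.8, 0.6], leak=0.05))
```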

April 14 (Thu), 2022 The final exam will be a take home exam. This email message was sent to all students after class: "As announced in today's class, the final exam will be a take-home exam. The exam will be posted in the departmental dropbox, and you will submit your answer there, in the same way as for homework. Further details will be announced in later classes and/or via email at a later date." Exercise 3.12 (iii) discussed in class. (This part will not be graded for HW6.) Probability propagation: constructing a junction tree for the Chest Clinic example. Stochastic Simulation in Bayesian networks: Section 4.8 [J].

April 19 (Tue), 2022 Stochastic Simulation in Bayesian Networks: Gibbs Sampling (Section 4.8 [J] completed). Decision Graphs (Ch.9 [J]) started, through Section 9.2 [J]. Discussion of the exercises for HW7.
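
A minimal Gibbs sampler on a made-up collider network, resampling each non-evidence variable from its distribution given its Markov blanket (not the textbook's example):

```python
# Gibbs sampling for A -> C <- B with evidence C = 1 (binary variables).
import random

pA = {1: 0.3, 0: 0.7}
pB = {1: 0.4, 0: 0.6}
pC_AB = {(1, 1): 0.95, (1, 0): 0.6, (0, 1): 0.7, (0, 0): 0.05}  # P(C=1 | A, B)

def p_c(c, a, b):
    return pC_AB[(a, b)] if c == 1 else 1.0 - pC_AB[(a, b)]

def resample_A(state):
    # P(A = a | B, C) is proportional to P(A = a) * P(C | a, B).
    w1 = pA[1] * p_c(state['C'], 1, state['B'])
    w0 = pA[0] * p_c(state['C'], 0, state['B'])
    return 1 if random.random() < w1 / (w1 + w0) else 0

def resample_B(state):
    w1 = pB[1] * p_c(state['C'], state['A'], 1)
    w0 = pB[0] * p_c(state['C'], state['A'], 0)
    return 1 if random.random() < w1 / (w1 + w0) else 0

random.seed(0)
state = {'A': 0, 'B': 0, 'C': 1}          # C is clamped to the evidence
counts, burn_in, n_samples = {'A': 0, 'B': 0}, 1000, 20000
for t in range(burn_in + n_samples):
    state['A'] = resample_A(state)
    state['B'] = resample_B(state)
    if t >= burn_in:
        counts['A'] += state['A']; counts['B'] += state['B']
print({v: counts[v] / n_samples for v in counts})  # estimates of P(A=1|C=1), P(B=1|C=1)
```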

April 21 (Thu), 2022 Reminder: The final exam will be a take-home exam. The exam will be posted in the departmental dropbox. The exam will be comprehensive. It will likely be posted on Saturday, April 23, with a due date of Friday, April 29. I shared with the students an email of 2022-04-20 from Thomas Nielsen that acknowledges that the authors' solution of exercise 3.13 is incorrect and thanks the students for finding the error. Here is an excerpt: "As for the solution, then I am afraid that you are (again) correct. I have recalculated the probabilities for T4|T3 are I also get numbers close to 0.5 (see attached model). Please check whether they match your results. This, unfortunately, also means that the nice solution feature where the MPE configuration differs from the one obtained by sum-propagation no longer holds. If you replace Table 3.10 with the table below, you should get the correct results. [Table omitted.] Thanks a lot for pointing out this problem out (and also thanks to your students for the effort). I will update the errata sheet with the table above." Decision Graphs (Ch.9 [J]) completed, using slides from the authors of the textbook. End of course.