January 12 (Mon), 2009 Introduction to the Course: Goals and Topics to Be Covered. The bulletin description, explained: "582 -- Bayesian Networks and Decision Graphs. {=STAT 582} (3) (Prereq: CSCE 350 and STAT 509) Normative approaches to uncertainty in artificial intelligence. Probabilistic and causal modeling with Bayesian networks and influence diagrams. Applications in decision analysis and support. Algorithms for probability update in graphical models." Examples of plausible reasoning and causal graphs: icy roads and earthquake.
January 14 (Wed), 2009 More examples of plausible reasoning and causal graphs: wet grass. Reasoning in the evidential and causal direction. Intercausal reasoning and explaining away. Uncertain reasoning is non-compositional. Serial, diverging, and converging connections. Definition of d-separation, with several examples. Complexity of checking d-separation using the naive algorithm based on the definition: list all chains, check every connection on each chain.
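A minimal Python sketch of the naive check just described (the graph encoding and function names are mine, not from [J07]): enumerate every chain between the two variables and test each connection on it against the serial, diverging, and converging rules.

    def descendants(children, x):
        """All nodes reachable from x along directed edges (children: node -> list of children)."""
        seen, stack = set(), [x]
        while stack:
            for c in children[stack.pop()]:
                if c not in seen:
                    seen.add(c)
                    stack.append(c)
        return seen

    def d_separated(children, x, y, z):
        """Naive d-separation test: x and y are d-separated by the evidence set z
        iff every chain between them contains at least one blocked connection."""
        parents = {v: [u for u in children if v in children[u]] for v in children}
        neighbours = {v: set(children[v]) | set(parents[v]) for v in children}

        def blocked(a, b, c):
            if b in children[a] and c in children[b]:    # serial a -> b -> c
                return b in z
            if a in children[b] and c in children[b]:    # diverging a <- b -> c
                return b in z
            if b in children[a] and b in children[c]:    # converging a -> b <- c
                return b not in z and not (descendants(children, b) & z)
            return b in z                                # serial a <- b <- c

        def chains(current, target, visited):
            if current == target:
                yield visited
            else:
                for n in neighbours[current]:
                    if n not in visited:
                        yield from chains(n, target, visited + [n])

        return all(any(blocked(p[i - 1], p[i], p[i + 1]) for i in range(1, len(p) - 1))
                   for p in chains(x, y, [x]))

On the Burglary/Earthquake/Alarm example, d_separated(bn, 'Burglary', 'Earthquake', set()) should come out True, and False once 'Alarm' is added to the evidence set (explaining away). The cost is exponential in the worst case, since all chains are listed, which is the point of the complexity remark above.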
January 16 (Fri), 2009 Probability. The three axioms of Kolmogorov [1950]. Outcome space, event space, probability measure. Three historically important interpretations in which the axioms of Kolmogorov are true (i.e., three models of them): classical, frequentist, and subjective. Presentation of the classical approach, including sketch of the proof of the three properties that are the axioms of Kolmogorov. Brief presentation of the frequentist approach. Subjective probability defined from betting behavior: "[De Finetti] defines the probability P(E) of an event E as the fraction of a whole unit value which one would feel is the fair amount to exchange for the promise that one would receive a whole unit of value if E turns out to be true and zero units if E turns out to be false" [Neapolitan, 1990, p.56]. Similarly, subjective probability could be defined using the equivalent urn model. The probability P(E) of an event E is the fraction of red balls in an urn containing red and brown balls such that one would feel indifferent between the statement "E will occur" and "a red ball would be extracted from the urn." (I believe that this definition is due to D.V. Lindley.) Kolmogorov's axioms and the "definition" of conditional probability can be derived from the definition of subjective probability and the assumption of coherence in (betting) behavior, as explained on pp.56-58 of the handout. "Definition" of conditional probability derived in the classical approach (Theorem 2.2), with the assumption that equiprobable outcomes remain equiprobable in a new state of knowledge. Definition of conditional probability as an additional axiom of probability, which is true in the three major models of probability (classical, frequentist, subjective).
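For reference, the axioms and the conditional-probability rule in symbols (finite additivity shown; for infinite outcome spaces Kolmogorov also requires countable additivity):

    P(E) \ge 0                          for every event E,
    P(\Omega) = 1                       for the sure event \Omega,
    P(E \cup F) = P(E) + P(F)           whenever E and F are mutually exclusive,

    P(E \mid F) = P(E \cap F) / P(F)    whenever P(F) > 0.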
January 21 (Wed), 2009 HW1 assigned: exercises 1.7, 1.11, 1.12, 1.13 [J07], due January 28, 2009 (Wednesday). The fundamental rule, Bayes' theorem, conditional independence, factorization, how variables and their states (values) partition the outcome space, probability of variables, probability tables, marginalization.
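The identities covered, in symbols:

    P(A, B) = P(A \mid B) P(B)                   (the fundamental rule)
    P(B \mid A) = P(A \mid B) P(B) / P(A)        (Bayes' theorem)
    P(A) = \sum_B P(A, B)                        (marginalization of B out of P(A, B))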
January 23 (Fri), 2009 Potentials. Independence for variables. Conditional independence is symmetric. Simple evidence: a statement that a variable is in a particular state. Conditioning on evidence by setting inconsistent configurations to zero and renormalizing. Random variable, mean, variance (very briefly). Chapter 1 completed.
January 26 (Mon), 2009 Reminder: HW1 (exercises 1.7, 1.11, 1.12, 1.13 [J07]) is due January 28, 2009 (Wednesday). Chapter 2 [J07] through p.34. Causal networks. Definition of Bayesian networks according to Jensen and Nielsen.
January 28 (Wed), 2009 HW1 delayed: due January 30, 2009. Chapter 2 [J07] up to the chain rule.
January 30 (Fri), 2009 HW1 collected and correction given in class. Proof of the chain rule for Bayesian networks (version without explicit induction) from Neapolitan's definition [Neapolitan, 1990; in ppt Intro slides]. Comments on the alternative definition of BN used in [J07].
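The chain rule in symbols, for a Bayesian network over the universe U = {A_1, ..., A_n} with pa(A_i) the parent set of A_i:

    P(U) = \prod_{i=1}^{n} P(A_i \mid pa(A_i))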
February 2 (Mon), 2009 HW2 assigned: exercises 2.1, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9, 2.10, 2.12, 2.14 [J07] due Friday, February 6, 2009. Markov blanket. Determining d-separation using the moralized ancestral graph.
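A Python sketch of the moralized-ancestral-graph test (encoding and names are mine): keep only X, Y, Z and their ancestors, marry the parents of every remaining node, drop directions, delete the evidence nodes, and check whether X and Y end up disconnected.

    def d_separated_moral(children, xs, ys, zs):
        """d-separation test via the moralized ancestral graph.
        children: node -> list of children; xs, ys, zs: sets of variables."""
        parents = {v: {u for u in children if v in children[u]} for v in children}

        # 1. Restrict to X, Y, Z and their ancestors.
        keep, stack = set(), list(xs | ys | zs)
        while stack:
            v = stack.pop()
            if v not in keep:
                keep.add(v)
                stack.extend(parents[v])

        # 2. Moralize: link each kept node to its parents and marry the parents.
        undirected = {v: set() for v in keep}
        for v in keep:
            for p in parents[v]:
                undirected[v].add(p)
                undirected[p].add(v)
            for p in parents[v]:
                for q in parents[v]:
                    if p != q:
                        undirected[p].add(q)

        # 3. Delete the evidence nodes and test whether X can still reach Y.
        seen, stack = set(xs - zs), list(xs - zs)
        while stack:
            for n in undirected[stack.pop()] - zs:
                if n not in seen:
                    seen.add(n)
                    stack.append(n)
        return not (seen & ys)

Unlike the naive chain-listing check, this runs in time polynomial in the size of the network.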
February 4 (Wed), 2009 HW1 returned. Variable elimination. Visit to Asia example and bucket elimination, using "Intro to BN" slides.
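The idea of variable elimination in one line, for a small chain C -> B -> A: push each summation as far in as it will go and eliminate one variable at a time,

    P(A) = \sum_B \sum_C P(A \mid B) P(B \mid C) P(C)
         = \sum_B P(A \mid B) \sum_C P(B \mid C) P(C).

Bucket elimination organizes the same computation by assigning each potential to the bucket of the first of its variables to be eliminated and then processing the buckets in that order.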
February 6 (Fri), 2009 HW2 collected and corrected. HW3 assigned: 2.17, 2.18, 2.19, 2.20, 2.23, due Friday, February 13, 2009. Email wang82@cse.sc.edu (Jingsong Wang) or the instructor for a link to the full version of Hugin (for which we have a license). The Hugin web site is www.hugin.com.
February 9 (Mon), 2009 HW2 returned. Building models (Ch.3 [J07]): Catching the Structure (3.1), through the insemination example (3.1.3).
February 11 (Wed), 2009 Reminder: HW3 due Friday. MT1 will be on Wednesday, February 18, 2009. Section 3.1 (Catching the Structure) completed. Section 3.2 (Determining the Conditional Probabilities) through 3.2.1 (Milk Test).
February 13 (Fri), 2009 HW3 collected. Correction of HW3. Section 3.2 (Determining the Conditional Probabilities) through 3.2.3 (Poker Game).
February 16 (Mon), 2009 Reminder: MT1 will be on Wednesday, February 18, 2009. The stratum method for constructing Bayesian networks (from independence information derived from qualitative descriptions), with a detailed example (Alarm, Burglary, Earthquake, MaryCalls, JohnCalls). Section 3.2 (Determining the Conditional Probabilities) through 3.2.4 (Transmission of Symbol Strings).
February 18 (Wed), 2009 MT1.
February 20 (Fri), 2009 MT1 returned. Correction of MT1. Cold, Angina, Sore Throat, and Noisy-OR.
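The noisy-OR table in formula form, with q_i the inhibitor probability that cause X_i = y alone fails to produce Y = y:

    P(Y = n \mid X_1, ..., X_k) = \prod_{i : X_i = y} q_i,
    P(Y = y \mid X_1, ..., X_k) = 1 - \prod_{i : X_i = y} q_i.

A background (leak) probability can be handled by adding an extra cause that is always on.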
February 23 (Mon), 2009 HW4 assigned, due Friday, February 27: exercises 3.3, 3.5, 3.6, 3.8, 3.9, 3.10, all from [J07]. Noisy-OR, noisy functional dependencies, divorcing, constraints (relations), modeling multiple experts with different levels of confidence. Interventions (started).
February 25 (Wed), 2009 Interventions: excision (cut) semantics; forcing variables; example from Korb and Nicholson. Object-oriented BNs. Repetitive models: temporal BNs, Hidden Markov Models (HMMs), Kalman filters.
February 27 (Fri), 2009 HW4 collected. Example of OOBN construction in Hugin: car safety characteristics, with a (tire) grip class. Continuous variables, with a Hugin example: Cold and Angina.
March 2 (Mon), 2009 Continuous variables, with a Hugin example: Cold and Angina, completed with the addition of the Thermometer variable. Conflict measure. Chapter 4 (Belief Updating in Bayesian Networks) started: review of chain rule, potentials, and the domain of potentials, through (almost all of) 4.1.1.
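The conflict measure, as I recall its form in [J07], for evidence e = e_1, ..., e_m:

    conf(e) = \log_2 ( P(e_1) \cdots P(e_m) / P(e) ),

where a positive value indicates that the findings fit the model worse than they would if they were independent, i.e., possible conflict (or a rare case).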
March 4 (Wed), 2009 HW5 assigned: 3.11, 3.12, 3.13, 3.14, 3.15, due Wednesday, March 18, 2009. HW6 assigned: 3.16, 3.19, 3.20, 3.21, 3.26, 3.27, 3.28, due Wednesday, March 25, 2009. Correction of HW4, with discussion.
March 6 (Fri), 2009 Further discussion of Exercise 3.10(ii) and (iii). See the updated HW4 correction document. Variable elimination and bucket elimination as presented in Ch.4 [J07].
March 16 (Mon), 2009 Graph-theoretic representation (4.2 [J07]): triangulated graphs (beginning of 4.2). Alternative (traditional) definition of triangulated (chordal) graph, and the following theorem: a graph is triangulated iff it has a perfect elimination ordering (Theorem 4.9 in [J95]; proof there).
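A small Python check of the property behind Theorem 4.9 (names are mine): an ordering of the vertices is a perfect elimination ordering iff, for every vertex, its neighbours that come later in the ordering form a clique; the graph is triangulated iff some ordering passes this test.

    def is_perfect_elimination_ordering(adj, order):
        """adj: node -> set of neighbours in an undirected graph; order: list of all nodes.
        Returns True iff, for every vertex, its later neighbours are pairwise adjacent."""
        position = {v: i for i, v in enumerate(order)}
        for v in order:
            later = [u for u in adj[v] if position[u] > position[v]]
            for i, u in enumerate(later):
                for w in later[i + 1:]:
                    if w not in adj[u]:
                        return False
        return True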
March 18 (Wed), 2009 HW5 collected. Brief discussion of some difficult parts of it. Chapter 4 through Thm. 4.3 (proved) and Thm. 4.4 (stated only).
March 20 (Fri), 2009 Theorem 4.4 proved, using the maximal spanning tree approach.
March 23 (Mon), 2009 Algorithms for building join trees: the maximal spanning tree approach, direct construction (Section 4.3.1 [J07]), and a hypergraph-based approach (just sketched). Junction trees: definition, example, and example of propagation (first part of 4.4 [J07]).
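A Kruskal-style sketch of the maximal spanning tree construction (names are mine): connect the cliques of the triangulated graph, preferring edges with larger separators; for a connected triangulated graph the result is a join tree.

    def join_tree(cliques):
        """cliques: list of sets of variables. Returns a maximal-weight spanning tree
        of the clique graph, where the weight of an edge is the size of the separator
        (the intersection of the two cliques)."""
        edges = sorted(((len(cliques[i] & cliques[j]), i, j)
                        for i in range(len(cliques))
                        for j in range(i + 1, len(cliques))), reverse=True)

        parent = list(range(len(cliques)))          # union-find forest for Kruskal
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        tree = []
        for weight, i, j in edges:                  # heaviest separators first
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                tree.append((i, j, cliques[i] & cliques[j]))
        return tree

For example, join_tree([{'A','B','C'}, {'B','C','D'}, {'C','E'}]) links the first two cliques through the separator {'B','C'} and attaches the third through {'C'}.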
March 25 (Wed), 2009 HW6 due date changed to Wednesday, April 1, 2009. Discussion of several exercises from HW5 and the related BNs: 3.12 (all parts), 3.14 (all parts), 3.15. The numbers used in the authors' solution of 3.14(i) are the same as those computed by Mr. Stiffler: the first, second, and fourth entries are 0.1672, 0.0445, and 0.4659. The authors need to add a correction about this on the errata list. Junction tree example of Section 4.4 [J07] completed.
March 27 (Fri), 2009 Exploiting the information scenario with lazy propagation in junction trees: barren nodes and d-separation. Triangulation.
March 30 (Mon), 2009 There will be no video for the April 1 class. Review of the algorithm for elimination of a variable in junction trees (Marginalization for Lazy Propagation). Stochastic Simulation in Bayesian Networks: Introduction and Probabilistic Logic Sampling (4.8.1 [J07]).
April 1 (Wed), 2009 HW6 collected. Stochastic Simulation in Bayesian Networks: Likelihood Weighting and Gibbs Sampling. Loopy belief propagation. Recursive conditioning (started).
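A minimal likelihood-weighting sketch in Python (the CPT encoding and names are mine, and the evidence is assumed to have nonzero probability): sample the unobserved variables in topological order, keep the evidence variables fixed, and weight each sample by the probability of the evidence given the sampled parents.

    import random

    def likelihood_weighting(parents, cpts, order, evidence, query, n_samples=10000):
        """parents: node -> tuple of parents; order: a topological ordering of the nodes;
        cpts[X][parent_values] is a dict {value: probability}; evidence: {node: value}.
        Estimates P(query | evidence)."""
        totals = {}
        for _ in range(n_samples):
            sample, weight = {}, 1.0
            for x in order:
                dist = cpts[x][tuple(sample[p] for p in parents[x])]
                if x in evidence:
                    sample[x] = evidence[x]
                    weight *= dist[evidence[x]]      # weight by the likelihood of the finding
                else:
                    r, acc = random.random(), 0.0
                    for value, p in dist.items():
                        acc += p
                        if r <= acc:
                            break
                    sample[x] = value                # last value also guards against round-off
            totals[sample[query]] = totals.get(sample[query], 0.0) + weight
        z = sum(totals.values())
        return {value: w / z for value, w in totals.items()}

Probabilistic logic sampling is the special case in which samples that disagree with the evidence are simply discarded (weights 0 or 1).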
April 3 (Fri), 2009 Overview of candidate papers for presentation---see lecture note transcript and video. Chapter 9 (Graphical Languages for Specification of Decision Problems) [J07] started.
April 6 (Mon), 2009 Choice of papers for presentation: Ben Fine and Nick Stiffler chose "Bayesian networks: A teacher's view." Chapter 9 continued: through Theorem 9.1 [J07].
April 8 (Wed), 2009 Chapter 9 continued: decision trees.
April 10 (Fri), 2009 Slides for Ch.9 completed. Several advanced issues in Chapter 9 were not covered.
April 13 (Mon), 2009 D-separation in influence diagrams and the chain rule for influence diagrams (first part of Section 10.1 [J07]). Value of information: Section 11.1 [J07], using the slides provided by the authors.
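The basic computation in Section 11.1, ignoring the cost of the observation: with EU(d) the expected utility of decision option d given the current evidence, the expected value of observing a variable X before deciding is

    EV(X) = \sum_x P(x) \max_d EU(d \mid x)  -  \max_d EU(d),

which is never negative and is zero when the same option stays optimal for every outcome of X.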
April 15 (Wed), 2009 First student presentation, by Wang and Langevin. See elsewhere on web site for details.
April 17 (Fri), 2009 Second student presentation, by Fine and Stiffler. See elsewhere on web site for details.
April 20 (Mon), 2009 Third and fourth student presentations on bioinformatics applications of Bayesian networks, by Cleveland and Wright. See elsewhere on web site for details.
April 22 (Wed), 2009 Student presentation by Hite. See elsewhere on web site for details. Parameter estimation from complete and incomplete datasets; the EM algorithm. Slides for Ch.6 [J07] used, prepared by the authors.
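In symbols, the M-step discussed in the slides sets each conditional probability to a ratio of expected counts, where the expectations are computed in the E-step from the current parameters and the (possibly incomplete) cases, and the two steps are iterated until convergence:

    P(X = x \mid pa(X) = \pi) = E[N(X = x, pa(X) = \pi)] / \sum_{x'} E[N(X = x', pa(X) = \pi)].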
April 24 (Fri), 2009 Student presentation by Sanderson. Learning of BNs. Scoring, the number of DAGs as a function of the number of nodes, constraint-based learning vs. score-based learning, skeletons and v-structures (unmarried colliders), Theorem 7.1 [J07], the PC algorithm.
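The number of DAGs on n labeled nodes is given by Robinson's recurrence,

    a(n) = \sum_{k=1}^{n} (-1)^{k+1} \binom{n}{k} 2^{k(n-k)} a(n-k),    a(0) = 1,

so a(1) = 1, a(2) = 3, a(3) = 25, a(4) = 543, a(5) = 29281: exhaustive search over structures is hopeless, hence scoring heuristics and constraint-based methods such as the PC algorithm.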
April 27 (Mon), 2009 Take-Home Final assigned. Variable elimination for influence diagrams: Parts of Section 10.2 [J07]. End of course.