CMPSCI 383: Artificial Intelligence

Fall 2014 (archived)

Exam 2 Review

The second exam will be given on Thursday, 30 October in AEBN 119 at 1900. The exam will cover material discussed in lectures from September 25 (the first lecture on quantifying uncertainty) to October 28 (the second lecture on machine learning) and the corresponding chapters of Russell & Norvig listed on the schedule, with some reference to topics from the prior exam that are relevant to machine learning (e.g., search).

The questions below cover the topics that may appear on the exam. You should be able to answer these questions. You should also be able to use the topics they cover in combination and to apply them to specific problems.

Working end-of-chapter problems in Chapters 13 and 14 may be particularly helpful.

Probability

  • What are atomic events? What is the relation between atomic events and the aspects of the world over which an agent is uncertain?
  • What is a joint probability distribution? What is a conditional probability distribution?
  • What is the product rule? What is the chain rule? (Note: on the exam, I will provide you with the equations for whatever calculations you’ll need, but you should understand what the equations represent.)
  • How can you use a joint probability table and enumeration to derive smaller joint distributions or conditional distributions?
  • What is the condition that defines complete independence? What is the condition that defines conditional independence?
  • What is Bayes' rule? How can you derive it from the product rule? (These identities, and the derivation, are written out after this list.)
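
For reference, the identities these questions ask about (all covered in Chapter 13 of Russell & Norvig) can be written as follows; note how equating the two forms of the product rule and dividing by P(B) yields Bayes' rule:

```latex
% Product rule (two equivalent forms):
P(A \wedge B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)

% Chain rule (the product rule applied repeatedly):
P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \ldots, X_{i-1})

% Bayes' rule (equate the two forms above and divide by P(B)):
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```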

Bayes nets

  • What is marginalization? What is conditioning?
  • Why are Bayes nets preferable to a full joint probability table as a representation?
  • What are the elements of a Bayesian network?
  • What conditional independence assumptions are encoded by a Bayesian network? (A small worked example follows this list.)
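
As a concrete illustration of the last two questions, here is a minimal Python sketch of the burglary network from Russell & Norvig (Figure 14.2). The five CPTs need only 10 numbers, whereas a full joint table over the same five Boolean variables needs 31; the factored product below is exactly the set of conditional independence assumptions the DAG encodes.

```python
# Full joint entry for the burglary network (Russell & Norvig, Fig. 14.2):
# P(B, E, A, J, M) factors as P(B) P(E) P(A|B,E) P(J|A) P(M|A).

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {  # P(Alarm=true | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}   # P(JohnCalls=true | Alarm)
P_M = {True: 0.70, False: 0.01}   # P(MaryCalls=true | Alarm)

def joint(b, e, a, j, m):
    """P(b, e, a, j, m) as a product of the network's CPT entries."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

# The textbook atomic event: the alarm sounds and both neighbors call,
# but there is no burglary and no earthquake.
print(joint(False, False, True, True, True))  # ~0.000628
```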

Exact and approximate inference in Bayes nets

  • How does inference by enumeration function? What are its problems?
  • How does sampling function? What potential problems can it encounter? What are some solutions to those problems?
  • How does direct sampling function? What potential problems can it encounter? What are some solutions to those problems?
  • How does rejection sampling function? What potential problems can it encounter? What are some solutions to those problems?
  • How does importance sampling (specifically: likelihood weighting) function? What potential problems can it encounter? What are some solutions to those problems? (Sketches of direct sampling, rejection sampling, and likelihood weighting follow this list.)
  • How does MCMC (specifically: Gibbs sampling) function? What potential problems can it encounter? What are some solutions to those problems?
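
The sketch below illustrates three of these samplers on a deliberately tiny two-node network (Cloudy → Rain, with made-up CPT numbers; both the network and its parameters are assumptions for illustration only, and Gibbs sampling is omitted to keep the sketch short). Note how rejection sampling throws away samples inconsistent with the evidence, while likelihood weighting keeps every sample and weights it instead:

```python
import random

# Toy two-node network Cloudy -> Rain; the CPT numbers below are
# illustrative assumptions, not parameters from the course.
P_CLOUDY = 0.5
P_RAIN_GIVEN = {True: 0.8, False: 0.2}   # P(Rain=true | Cloudy)

def prior_sample():
    """Direct sampling: sample each variable in topological order."""
    cloudy = random.random() < P_CLOUDY
    rain = random.random() < P_RAIN_GIVEN[cloudy]
    return cloudy, rain

def rejection_sample(n):
    """Estimate P(Cloudy=true | Rain=true) by discarding samples that
    disagree with the evidence. Wasteful when the evidence is rare."""
    kept = hits = 0
    for _ in range(n):
        cloudy, rain = prior_sample()
        if rain:              # keep only samples consistent with the evidence
            kept += 1
            hits += cloudy
    return hits / kept if kept else float("nan")

def likelihood_weighting(n):
    """Fix the evidence (Rain=true) and weight each sample by the
    likelihood of that evidence; no samples are wasted."""
    total = hit_weight = 0.0
    for _ in range(n):
        cloudy = random.random() < P_CLOUDY   # sample the non-evidence variable
        w = P_RAIN_GIVEN[cloudy]              # weight by P(Rain=true | cloudy)
        total += w
        if cloudy:
            hit_weight += w
    return hit_weight / total

# Exact answer: P(Cloudy=true | Rain=true)
#   = (0.5 * 0.8) / (0.5 * 0.8 + 0.5 * 0.2) = 0.8
print(rejection_sample(100_000), likelihood_weighting(100_000))
```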

Introductory Machine Learning

  • What are the steps common to most inductive learning techniques?
  • What is a contingency table, and how does it relate to joint and conditional distributions?
  • What does the chi-square statistic calculate?
  • What is the simplest way to learn joint or conditional probability tables, given a Bayes net’s structure (but no CPTs) and a corresponding data set? (Both this counting approach and the chi-square statistic are sketched after this list.)
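
A minimal sketch of both ideas, using a made-up six-record data set (an assumption for illustration, not course data): the contingency table is just the table of joint counts, maximum-likelihood CPT estimation is normalized counting, and the chi-square statistic compares observed counts against the counts expected under independence:

```python
from collections import Counter

# Toy data set over two Boolean attributes; the records are
# illustrative assumptions, not a data set from the course.
data = [
    {"A": True,  "B": True},  {"A": True,  "B": True},
    {"A": True,  "B": False}, {"A": False, "B": False},
    {"A": False, "B": False}, {"A": False, "B": True},
]

# Contingency table: joint counts of (A, B).
counts = Counter((r["A"], r["B"]) for r in data)
n = len(data)

# Maximum-likelihood CPT estimation is just normalized counting:
# P(B=true | A=a) = count(A=a, B=true) / count(A=a).
for a in (True, False):
    n_a = counts[(a, True)] + counts[(a, False)]
    print(f"P(B=true | A={a}) =", counts[(a, True)] / n_a)

# Chi-square statistic: sum over cells of (observed - expected)^2 / expected,
# where the expected counts assume A and B are independent.
chi2 = 0.0
for a in (True, False):
    for b in (True, False):
        row = counts[(a, True)] + counts[(a, False)]
        col = counts[(True, b)] + counts[(False, b)]
        expected = row * col / n
        chi2 += (counts[(a, b)] - expected) ** 2 / expected
print("chi-square =", chi2)
```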

Rule learning

  • In the context of learning conjunctive logical rules, what are generalization and specialization?
  • If you generalize a rule, what effect does that have on the frequencies in a contingency table? What effect does specialization have? (See the sketch after this list.)
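
A minimal sketch of the coverage effect, using a hypothetical rule representation and data set (both are assumptions for illustration): dropping a conjunct generalizes a rule, so it covers a superset of the examples and the covered-row counts of the contingency table can only stay the same or grow; adding a conjunct specializes it, and those counts can only stay the same or shrink:

```python
# A conjunctive rule as a set of (attribute, value) tests; an example is
# covered when every test matches. The data and rules below are
# illustrative assumptions, not examples from the lectures.
data = [
    {"sky": "sunny", "wind": "weak",   "play": True},
    {"sky": "sunny", "wind": "strong", "play": True},
    {"sky": "rainy", "wind": "weak",   "play": False},
    {"sky": "rainy", "wind": "strong", "play": False},
]

def covers(rule, example):
    return all(example[attr] == val for attr, val in rule)

def coverage(rule):
    """Covered-row counts of a 2x2 contingency table:
    (covered & positive, covered & negative)."""
    pos = sum(1 for e in data if covers(rule, e) and e["play"])
    neg = sum(1 for e in data if covers(rule, e) and not e["play"])
    return pos, neg

specific = [("sky", "sunny"), ("wind", "weak")]  # more conjuncts
general = [("sky", "sunny")]                     # dropping a conjunct generalizes

print(coverage(specific))  # (1, 0): covers fewer examples
print(coverage(general))   # (2, 0): generalizing can only grow these counts
```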