Description

This course will provide an overview of the key ideas that underlie the Bayesian approach to modeling data, with a particular focus on text. The course will consist of discussing, deriving, and implementing a number of Bayesian models of text (and their associated inference algorithms) in order to understand their fundamental strengths and weaknesses, as well as explore the relationships between them. The aim of the course is to develop the knowledge and skills needed to design, implement, and apply such models to real-world data. Students entering the course should have good programming skills, knowledge of algorithms, some background in probability, statistics, or machine learning, and a strong interest in text analysis. To facilitate productive discussion, students with diverse research backgrounds and interests are especially encouraged to participate.

General Information

  • Instructor: Hanna M. Wallach (wallach at cs umass edu)
  • Instructor Office Hours: By appointment only
  • Lectures: Wednesdays 3:30pm to 5:30pm, CS 140

Tentative Schedule

  • Sep. 05: Introduction, probabilistic modeling
  • Sep. 12: Beta-binomial unigram language model
  • Sep. 19: Dirichlet-multinomial unigram language model
  • Sep. 26: Dirichlet-multinomial mixture model: known groups
  • Oct. 03: Dirichlet-multinomial mixture model: Gibbs sampling 1
  • Oct. 10: Dirichlet-multinomial mixture model: Gibbs sampling 2
  • Oct. 17: No class
  • Oct. 24: Latent Dirichlet allocation: Gibbs sampling
  • Oct. 31: Hyperparameter inference: slice sampling
  • Nov. 07: No class
  • Nov. 14: Hyperparameter optimization
  • Nov. 21: No class (Thanksgiving)
  • Nov. 28: Variational inference
  • Dec. 05: No class
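The first models on the schedule rest on conjugate prior/likelihood pairs. As an illustrative sketch (not course-provided material), the Sep. 12 topic, the beta-binomial model, reduces posterior inference to simple counting: a Beta(alpha, beta) prior updated with s successes in n trials gives a Beta(alpha + s, beta + n - s) posterior. The function names below are hypothetical.

```python
# Hedged sketch of beta-binomial conjugate updating for a toy binary
# "unigram" model (e.g., occurrences of a single word vs. any other word).
# Prior: Beta(alpha, beta). Data: `successes` out of `trials`.

def beta_binomial_posterior(alpha, beta, successes, trials):
    """Return the parameters of the posterior Beta distribution."""
    return alpha + successes, beta + (trials - successes)

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution; also the posterior
    predictive probability of a success under this model."""
    return alpha / (alpha + beta)

# Example: Beta(2, 2) prior, 7 successes in 10 trials.
a, b = beta_binomial_posterior(2.0, 2.0, successes=7, trials=10)
print(posterior_mean(a, b))  # 9/14 ≈ 0.643
```

The same counting pattern generalizes to the Dirichlet-multinomial models covered in the following weeks, with per-outcome counts replacing the single success count.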

Homeworks

Homeworks are due at the start of class on the day indicated. Completed Python files must be uploaded to an appropriately named subdirectory (e.g., hw1) of the cs691bm directory in your home directory on EdLab. Written materials must be submitted on paper at the start of class or uploaded in PDF format to the appropriate directory on EdLab. Late homeworks will not be accepted.

  • Sep. 05: Sampling from a discrete distribution (due Wed. Sep. 12) [tar.gz]
  • Sep. 12: Beta-binomial (due Wed. Sep. 19) [tar.gz]
  • Sep. 19: Dirichlet-multinomial (due Wed. Sep. 26) [tar.gz]
  • Sep. 26: Dirichlet-multinomial mixture model: known groups (due Wed. Oct. 03) [tar.gz]
  • Oct. 03: Dirichlet-multinomial mixture model: Gibbs sampling 1 (due Wed. Oct. 10) [tar.gz]
  • Oct. 10: Dirichlet-multinomial mixture model: Gibbs sampling 2 (due Wed. Oct. 24) [tar.gz]
  • Oct. 24: Latent Dirichlet allocation: Gibbs sampling (due Wed. Oct. 31) [tar.gz]
  • Oct. 31: Hyperparameter inference: slice sampling (due Wed. Nov. 07) [tar.gz]
  • Nov. 14: Hyperparameter optimization (due Wed. Nov. 28) [tar.gz]

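The first homework's topic, sampling from a discrete distribution, is the basic primitive that the later Gibbs samplers draw on repeatedly. A minimal sketch (not the course's starter code, which is in the tar.gz files above) uses the inverse-CDF method: draw a uniform number and walk the cumulative probabilities until it is exceeded.

```python
import random

# Hedged sketch: sample an outcome index i with probability probs[i]
# using the inverse-CDF (cumulative sum) method.

def sample_discrete(probs, rng=random):
    """Return i with probability probs[i]; probs must sum to 1."""
    u = rng.random()          # uniform draw in [0, 1)
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if u < cumulative:
            return i
    return len(probs) - 1     # guard against floating-point round-off

# Quick empirical check: frequencies should approach [0.2, 0.5, 0.3].
random.seed(0)
counts = [0, 0, 0]
for _ in range(10000):
    counts[sample_discrete([0.2, 0.5, 0.3])] += 1
print(counts)
```

In a Gibbs sampler for the mixture models and LDA covered later in the course, a routine like this is called once per latent assignment per sweep, with `probs` recomputed from the current counts each time.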
Paper Critique

Paper critiques are due at 11:59pm on Fri. Dec. 7. Completed critiques must be uploaded (in PDF format; no more than 4 pages) to an appropriately named subdirectory (i.e., critique) of the cs691bm directory in your home directory on EdLab. Late critiques will not be accepted.

  • Oct. 24: Paper critique (due Fri. Dec. 7 at 11:59pm) [txt]

Grade Breakdown

  • Homeworks (40%): There will be nine homeworks (0-2 points each), as listed above. Together, these will count for 40% of your grade. Homeworks will be graded on both correctness and clarity.
  • Paper Critique (20%): You must critique an existing paper on Bayesian methods for text, selected from an instructor-provided list. This will count for 20% of your grade. Your critique should consist of a summary/explanation of the key ideas, followed by detailed comments regarding the pros and cons of the approach (with justifications) and questions/comments/thoughts about the work, etc. You should concentrate on the content (i.e., the problem, ideas, evaluation methodology, etc.). You should not comment on grammar or typographical errors.
  • Final Exam (40%): The final exam will count for 40% of your grade.

In accordance with UMass policy, incomplete (INC) grades will only be given when (documented) severe physical/mental medical reasons have prevented the completion of course requirements.

Academic Honesty Policy

The amount of outside help permitted depends on the course component:

  • Homeworks: You may discuss the homeworks with other students—in fact, I encourage this as a learning experience. However, your writeup and code (if appropriate) must be your own. Copying is not permitted; neither is collaboration so close that it looks like copying. If I receive two identical homeworks, I will accept neither of them (i.e., both students will receive 0 for that homework) and I will pursue formal action. A good practice is to divide your work into an "ideas phase" where you collaborate, followed by a "writeup/implementation phase" where you work alone. You should enter the latter phase with notes, but not problem solutions or code. You must write a list of all your collaborators at the top of each assignment. This list should include anyone with whom you have discussed the assignment. If you make use of any printed or online sources while working on an assignment (other than instructor-provided course materials), these must also be listed. Copying solutions from the web is cheating and is surprisingly easy to detect.
  • Final Exam: No outside help is permitted. Any cheating is grounds for an F in the course.

For more information, please see the Dean of Students' Academic Honesty Policy.