[Back to CS585 home page]
Schedule
Reload this page to make sure you’re seeing the latest version.
Readings should be done before the indicated class.
Most slides are also available in Keynote format: change .pdf to .key in the URL.
“JM” refers to the Jurafsky and Martin text; readings cite sometimes the 2nd edition and sometimes the 3rd.
- HW0 is out! Due Thursday.
Th 9/10 - Probability and Naive Bayes [slides pdf]
- HW0 is due as a hard copy, in class, at the start of lecture.
- In-class exercise #1 on NB. We did not get to the bonus question.
Readings:
- MacKay 2.1-2.3 (Probability)
- JM 3rd ed. 7.1 (Naive Bayes)
Optional resources:
Weekend assignment: install Python and get comfortable with it.
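As a companion to the JM 7.1 reading and the in-class NB exercise, here is a minimal sketch of a multinomial Naive Bayes classifier with add-one smoothing (a toy illustration, not the exercise solution; documents and labels in the usage below are made up):

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train multinomial Naive Bayes. docs: list of (tokens, label) pairs."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)  # per-label word counts
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return label_counts, word_counts, vocab

def predict_nb(model, tokens):
    """Return argmax_label [ log P(label) + sum_w log P(w | label) ],
    with add-one (Laplace) smoothing on the word likelihoods."""
    label_counts, word_counts, vocab = model
    n_docs = sum(label_counts.values())
    V = len(vocab)
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / n_docs)  # log prior
        total = sum(word_counts[label].values())
        for w in tokens:
            score += math.log((word_counts[label][w] + 1) / (total + V))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

Working in log space keeps the products of many small probabilities from underflowing, which matters once documents have more than a handful of words.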
Tu 9/15 - More Classification [slides pdf]
HW1 is out. Due Friday Sept 25. Files:
Readings for today:
- JM 3rd ed. 7.2-7.4 (Classification)
Python demo notebook HTML from lecture. (Also ipynb format)
Th 9/17 - Logistic Regression [slides pdf]
Readings:
Tu 9/22 - Guest Lecture by Gaja Jarosz
[slides]
Th 9/24 - Part-of-speech tags [slides pdf]
Readings:
F 9/25: HW1 is due.
Tu 9/29 - HMM and Viterbi
Lecture notes on HMM and Viterbi
Readings:
- JM 3rd ed. 8.1-8.2, 8.4, 9.4
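To go with the lecture notes, here is a minimal sketch of Viterbi decoding for an HMM (probabilities as nested dicts; the toy weather/ice-cream HMM in the usage below is made up, loosely in the style of JM's examples). Working in log space turns products into additive scores:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden state sequence for obs under an HMM (a sketch)."""
    # trellis[t][s] = (best log prob of a path ending in state s at time t, backpointer)
    trellis = [{s: (math.log(start_p[s] * emit_p[s][obs[0]]), None) for s in states}]
    for t in range(1, len(obs)):
        col = {}
        for s in states:
            # best previous state, scored by path-so-far plus transition
            prev = max(states, key=lambda p: trellis[t - 1][p][0] + math.log(trans_p[p][s]))
            score = trellis[t - 1][prev][0] + math.log(trans_p[prev][s] * emit_p[s][obs[t]])
            col[s] = (score, prev)
        trellis.append(col)
    # follow backpointers from the best final state
    best = max(states, key=lambda s: trellis[-1][s][0])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        path.append(trellis[t][path[-1]][1])
    return list(reversed(path))
```

For example, with a hot/cold weather HMM emitting ice-cream counts, `viterbi(["3", "1", "3"], ...)` recovers a single best tag per time step by dynamic programming rather than enumerating all |states|^T sequences.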
Th 10/1 - Discriminative sequence models, part 1 [slides pdf]
Log-linear models, Conditional Random Fields (which are a type of log-linear model).
In-class exercise 3: additive Viterbi
Readings:
HW2 is out. Due Oct. 13 at midnight.
Tu 10/6 - Discriminative sequence models, part 2 [slides pdf]
Review Viterbi, and do Structured Perceptron learning.
Scan of in-class lecture scribblings
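The structured perceptron's learning rule itself is short; here is a sketch of the core update, with feature vectors represented as dicts (in the full algorithm, Viterbi decoding over the current weights would supply the predicted structure's features):

```python
from collections import defaultdict

def perceptron_update(weights, gold_feats, pred_feats):
    """One structured perceptron step: when the decoder's prediction is
    wrong, boost the gold structure's features and demote the predicted
    structure's features. No update when the prediction is correct."""
    if gold_feats != pred_feats:
        for f, v in gold_feats.items():
            weights[f] += v
        for f, v in pred_feats.items():
            weights[f] -= v
    return weights
```

Because gold and predicted structures usually share most features, the update only shifts weight on the features where they disagree.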
Th 10/8 - Projects discussion [slides]
Also board photos and notes on Viterbi, factor scores as a graph, and Problem 3
No class Tu 10/13
Tu 10/13: HW2 due
Th 10/15 - Midterm Review
Tu 10/20 - In-class Midterm
Th 10/22 - Edit distance and the noisy channel [slides]
Readings:
- JM 3ed, Chapter 6, “Spelling Correction and the Noisy Channel”. Plus a little bit of background for it:
- JM 3ed, Chapter 4 start, and 4.1: N-Gram LMs. (Also was in Gaja’s lecture.)
- JM 3ed, 2.4, “Minimum Edit Distance”. (May be helpful to skim earlier parts of chapter 2 for some context.)
- Optional but delightful: Norvig, How to write a spelling corrector. Implements a basic noisy channel spelling corrector in 21 lines of Python.
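For reference, the minimum edit distance of JM 2.4 fits in a few lines of dynamic programming. A sketch, with insertions and deletions costing 1 and a configurable substitution cost (JM's version charges 2 for substitutions):

```python
def min_edit_distance(source, target, sub_cost=1):
    """Minimum number of weighted insertions, deletions, and
    substitutions needed to turn source into target."""
    m, n = len(source), len(target)
    # d[i][j] = edit distance between source[:i] and target[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i  # delete everything
    for j in range(1, n + 1):
        d[0][j] = j  # insert everything
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(
                d[i - 1][j] + 1,  # deletion
                d[i][j - 1] + 1,  # insertion
                d[i - 1][j - 1]
                + (0 if source[i - 1] == target[j - 1] else sub_cost))
    return d[m][n]
```

On the textbook's example pair, `min_edit_distance("intention", "execution")` is 5 with unit costs and 8 with `sub_cost=2`.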
Tu 10/27: Machine Translation, Part 1 [slides]
Readings:
- JM 2ed, 25.1–25.5
Th 10/29: Machine Translation, Part 2 [slides]
Readings:
HW3: due Friday 11/6 (any time that day or night)
Tu 11/3: Human Evaluation (and finish MT)
Readings:
- JM 3ed, 7.2-7.3: “Evaluation: Prec/Rec/F and Stat Sig Testing”
- JM 2ed, 25.9: “MT Eval”
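The precision/recall/F computation from the JM 7.2-7.3 reading, as a quick sketch from true positive, false positive, and false negative counts:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and balanced F1 (harmonic mean of P and R)."""
    precision = tp / (tp + fp)   # of the things we predicted, how many were right
    recall = tp / (tp + fn)      # of the true things, how many did we find
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

The harmonic mean punishes imbalance: a system with high precision but near-zero recall gets a near-zero F1, unlike the arithmetic mean.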
Th 11/5: Sigtesting and Parsing, part 1
Readings:
- JM 2ed, 12.1-12.7 “Formal Grammars of English”
Tu 11/10: Parsing, part 2
Readings:
- JM 2ed, Ch 13, “Parsing with CFGs”
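Chapter 13's CKY algorithm, reduced here to a sketch of a recognizer for a grammar in Chomsky Normal Form (the chapter's full parser additionally stores backpointers to recover trees; the toy grammar in the usage below is made up):

```python
from collections import defaultdict

def cky_recognize(words, lexicon, rules, start="S"):
    """CKY recognition for a CNF grammar.
    lexicon: word -> set of preterminals (rules A -> word)
    rules: (B, C) -> set of parents A (rules A -> B C)"""
    n = len(words)
    chart = defaultdict(set)  # chart[i, j] = nonterminals spanning words[i:j]
    for i, w in enumerate(words):
        chart[i, i + 1] = set(lexicon.get(w, set()))
    for span in range(2, n + 1):          # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):     # try every split point
                for B in chart[i, k]:
                    for C in chart[k, j]:
                        chart[i, j] |= rules.get((B, C), set())
    return start in chart[0, n]
```

For example, with `lexicon = {"she": {"NP"}, "eats": {"V"}, "fish": {"NP"}}` and `rules = {("NP", "VP"): {"S"}, ("V", "NP"): {"VP"}}`, the sentence "she eats fish" is recognized while "she fish" is not.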
Th 11/12: Parsing, part 3
Reading: JM 2ed, Ch 13 again.
Tu 11/17: Dependencies and Coreference, Part 1
Reading:
Th 11/19: Coreference, Part 2
HW4 coreference: hw4.pdf and hw4.zip. Due Friday 12/4 (any time that day or night)
Tu 11/24: Lexical semantics [slides]
Suggested reading: JM 2ed, Chapters 19 and 20
(11/26: Thanksgiving)
Tu 12/1: Distributional semantics [slides]
Suggested reading: Turney and Pantel 2010, “From Frequency to Meaning: Vector Space Models of Semantics”
HW5 on distributional similarity
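Vector-space similarity in the Turney and Pantel sense can be sketched with sparse count vectors stored as dicts (the word counts in the usage below are made-up toy context counts, not real corpus data):

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse vectors (word -> count dicts)."""
    dot = sum(count * v.get(w, 0) for w, count in u.items())
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v)
```

Cosine ranges from 0 (no shared contexts, for nonnegative counts) to 1 (identical direction), and ignores vector length, so frequent and rare words are comparable.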
Th 12/3: Topic models and neural networks in NLP
Tu 12/8 and Th 12/10: Final presentations
See the projects page for details.