CS 685, Spring 2022, UMass Amherst CS
Mon/Wed 2:30-4 PM in Herter 227 (masks required!)
This class will also be livestreamed on YouTube! See the schedule for all videos, readings, and assignments.
Instructor: Mohit Iyyer
TAs: Andrew Drozdov, Shufan Wang
Email (to all of us): firstname.lastname@example.org
Office hours (US Eastern time), see Piazza for meeting links:
Natural Language Processing (NLP) is the art and science of teaching computers to understand human language. NLP is a type of artificial intelligence technology, and it is now ubiquitous -- NLP lets us talk to our phones, use the web to answer questions, map out discussions in books and social media, and even translate between human languages. Since language is rich, ambiguous, and very difficult for computers to understand, these systems can sometimes seem like magic -- but they are engineering problems we can tackle with data, math, and insights from linguistics.
This course will broadly focus on deep learning methods for natural language processing. Most of the semester will focus on neural language models and transfer learning, both of which have significantly pushed forward the state of the art. It is intended for graduate students in computer science and linguistics who are (1) interested in learning about cutting-edge research progress in NLP and (2) familiar with machine learning fundamentals. We will cover modeling architectures, training objectives, and downstream tasks (e.g., text classification, question answering, and text generation). Coursework includes reading recent research papers, programming assignments, and a final project. While this is an in-person class, all lectures will be livestreamed on YouTube, and video links will be posted on the course schedule.
A nice textbook for NLP fundamentals is Jurafsky and Martin, Speech and Language Processing, 3rd ed. For this course, however, readings will consist mainly of NLP conference papers (e.g., from ACL, NAACL, and EMNLP). We will post all readings as PDFs.
Other useful texts for NLP include: