CSCI 544: Applied Natural Language Processing — Fall 2019
Course objectives: Welcome! This course is designed to
introduce you to some of the problems and solutions of NLP, and their
relation to linguistics and statistics. You need to know how to
program and use common data structures.
It might also be nice, though it's not required, to have
some previous familiarity with linear algebra and probability.
At the end you should agree (I hope!)
that language is subtle and interesting, feel some ownership over some
of NLP's formal and statistical techniques, and be able to understand
research papers in the field.
Lectures: WF 5:30pm - 7:20pm
Location: WPH B27
Prof: Nanyun (Violet) Peng, Email: npeng@isi.edu
TAs: Rujun Han, Email: rujunhan@isi.edu
Graders: Sachin Vorkady Balakrishna, Email: vorkadyb@usc.edu; Anisha Jagadeesh Prasad, Email: ajagadee@usc.edu; Jiawei Zhang, Email: zhan890@usc.edu
Office hrs: Prof: Wed. 4:30pm at RTH 512, or by appt; TAs: Fri. 1:00pm - 3:00pm
Discussion site: Piazza, https://piazza.com/usc/fall2019/csci544/home (public questions, discussion, announcements)
Web page: https://violetpeng.github.io/cs544_fa19.html
Textbooks: Jurafsky & Martin, 3rd ed. (recommended); Manning & Schütze (recommended)
Policies: Grading: homework 40%, project 20%, midterm 15% or 25%, final 15% or 25%. Honesty: Viterbi integrity code, USC-Viterbi graduate policies.
Warning: The schedule below may change. Links to future lectures and assignments are placeholders and may change.
Week | Wednesday | Friday | Suggested Reading
8/26 | Introduction | Probability concepts |
9/2 | Modeling grammaticality; n-gram language models | Smoothing n-grams |
9/9 | Assignment 1 given (Probabilities); Intro to neural language models | No class (SoCal NLP symposium) |
9/16 | Context-free parsing | Guest lecture from TA |
9/23 | Assignment 1 due; Assignment 2 given (Language Models); Probabilistic parsing | Dependency parsing |
9/30 | Semantics | Midterm review |
10/7 | Midterm exam (5:30-6:30 in classroom) | Distributional semantics (word embeddings) |
10/14 | Sequence tagging models | Project proposal due; No class (fall break) |
10/21 | Assignment 2 due; Assignment 3 given (Semantics); HMM, MEMM, and CRF | Neural sequence tagging and relation extraction |
10/28 | Dialog systems | Text classification |
11/4 | Assignment 3 due; Assignment 4 given (Neural Sequence Tagging); Project proposal revision (if applicable) due | Machine translation |
11/11 | Phrase-based machine translation | Sequence-to-sequence models |
11/18 | NLP applications: creative generation | Socially responsible NLP |
11/25 | No class (Thanksgiving break) | Assignment 4 due; No class (Thanksgiving break) |
12/2 | Final project due; Final exam recitation I | Final exam recitation II; Final exam: Wed 12/11, in class |