CS 188: Natural Language Processing — Winter 2022
Course objectives: Welcome! This course introduces some of the core problems and solutions of NLP, and their relation to machine learning, statistics, linguistics, and the social sciences. You need to know how to program and use common data structures. Prior familiarity with linear algebra and probability is helpful, though not required. By the end you should agree (I hope!) that language is subtle and interesting, feel some ownership over some of NLP's techniques, and be able to understand research papers in the field.
Lectures: M/W 12:00-1:50pm
Location: Royce Hall 190
Prof: Nanyun (Violet) Peng (violetpeng@cs.ucla.edu)
TAs: Te-Lin Wu (telinwu@g.ucla.edu), Mingyu Derek Ma (ma@cs.ucla.edu)
Office hrs:
  Prof: Mon. 11:00am at Eng VI 397A, or zoom: link
  Wu: Thu. 1:00pm-2:00pm at Eng VI 389, or zoom: link
  Ma: Tue. 1:00pm-2:00pm at Eng VI 389, or zoom: link
TA sessions:
  Wu: Fri. 12:00pm-1:50pm at Renee and David Kaplan Hall 169, or zoom: link
  Ma: Fri. 2:00pm-3:50pm at Royce Hall 190, or zoom: link
Discussion site: Piazza (https://piazza.com/class/kxs66p57qcpw5) for public questions, discussion, and announcements
Web page: https://vnpeng.net/cs188_win22.html
Textbooks: Jurafsky & Martin, 3rd ed. (recommended); Manning & Schütze (recommended)
Policies:
  Grading: homework 30%, project 20%, midterm 20%, final 25%, participation 5%
  Honesty: UCLA Student Conduct Code
Warning: The schedule below may change. Links to future lectures and assignments are placeholders and will not be available until shortly before or after the corresponding lecture.
Week | Monday | Wednesday | Friday (TA sessions) | Suggested Reading |
1/3 | Introduction | Text classification and lexical semantics | |
1/10 | Project description out; Distributional semantics | N-gram language models | |
1/17 | Assignment 1 released; No lecture (MLK holiday) | N-gram language models (cont.) | |
1/24 | Smoothing n-grams | Intro to neural language models; RNN language models | |
1/31 | Assignment 1 due; Transformers and masked language models | Project midterm report due; Syntax | Assignment 1 answer key released |
2/7 | Midterm exam (12:00-1:50pm, in class); Assignment 1 grades returned | Sequence tagging models | Project feedback returned |
2/14 | Assignment 2 released; Sequence tagging models (cont.) | Named entity recognition | Midterm grades returned |
2/21 | No lecture (Presidents' Day) | Probabilistic parsing | |
2/28 | Assignment 2 due; Dependency parsing | Dependency parsing (cont.) | |
3/7 | Assignment 2 grades returned; Intro to machine translation | Numerical examples for classification and language models | Project final report due |