Instruction (course introduction, overview of NLG history, and basic techniques)
W1
Course overview, Intro to NLG, N-gram language models
Smoothing, log-linear language models, neural network basics
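To make the W1 material concrete, here is a minimal, illustrative sketch of a bigram language model with add-k smoothing; the toy corpus, the value of k, and all variable names are made-up placeholders, not course code.

```python
from collections import Counter

# Toy corpus with sentence-boundary markers (made-up example data).
corpus = ["<s> the cat sat </s>", "<s> the dog sat </s>", "<s> the cat ran </s>"]
sentences = [s.split() for s in corpus]

unigram_counts = Counter(w for sent in sentences for w in sent)
bigram_counts = Counter((a, b) for sent in sentences for a, b in zip(sent, sent[1:]))
vocab_size = len(unigram_counts)

k = 1.0  # add-k constant; k = 1 gives Laplace smoothing

def bigram_prob(prev: str, word: str) -> float:
    """P(word | prev) = (count(prev, word) + k) / (count(prev) + k * |V|)."""
    return (bigram_counts[(prev, word)] + k) / (unigram_counts[prev] + k * vocab_size)

print(bigram_prob("the", "cat"))  # seen bigram: (2 + 1) / (3 + 7) = 0.3
print(bigram_prob("the", "ran"))  # unseen bigram still gets nonzero probability
```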
W2
Neural language models, MT and Sequence-to-Sequence models (conditional LMs)
Decoding methods (beam search, sampling)
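As a quick illustration of the decoding methods listed for W2, the sketch below contrasts beam search with ancestral sampling over a toy next-token distribution. The function next_token_probs is a hypothetical stand-in for a trained conditional language model p(y_t | y_<t, x); the beam size, vocabulary, and probabilities are made up.

```python
import math
import random

def next_token_probs(prefix):
    # Hypothetical stand-in for a trained model's p(y_t | y_<t, x).
    if len(prefix) >= 3:
        return {"a": 0.1, "b": 0.1, "</s>": 0.8}
    return {"a": 0.5, "b": 0.3, "</s>": 0.2}

def beam_search(beam_size=2, max_len=6):
    """Keep the beam_size highest-scoring partial hypotheses at each step."""
    beams = [([], 0.0)]  # (tokens, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens and tokens[-1] == "</s>":  # finished hypotheses carry over
                candidates.append((tokens, score))
                continue
            for token, p in next_token_probs(tokens).items():
                candidates.append((tokens + [token], score + math.log(p)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return beams[0]

def ancestral_sampling(max_len=6):
    """Draw each token from the model's distribution instead of maximizing."""
    tokens = []
    for _ in range(max_len):
        probs = next_token_probs(tokens)
        tokens.append(random.choices(list(probs), weights=list(probs.values()))[0])
        if tokens[-1] == "</s>":
            break
    return tokens

print(beam_search())         # deterministic, high-probability output
print(ancestral_sampling())  # stochastic output, differs run to run
```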
Paper Presentations (3 weeks on methodology, 3 weeks on applications; in each class we present two papers on one topic)
W3
Topic: Sequence-to-sequence models · Neural Machine Translation by Jointly Learning to Align and Translate: https://arxiv.org/abs/1409.0473. Suggested supplementary readings: Sequence to Sequence Learning with Neural Networks: http://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf · Attention Is All You Need: https://arxiv.org/abs/1706.03762
Topic: Autoregressive language models · Breaking the Softmax Bottleneck: A High-Rank RNN Language Model: https://arxiv.org/abs/1711.03953 · XLNet: Generalized Autoregressive Pretraining for Language Understanding: https://arxiv.org/abs/1906.08237. Suggested supplementary readings: GPT-2: Language Models are Unsupervised Multitask Learners: https://www.techbooky.com/wp-content/uploads/2019/02/Better-Language-Models-and-Their-Implications.pdf
W4
Topic: VAE-based generation · Generating Sentences from a Continuous Space: https://arxiv.org/abs/1511.06349 · Avoiding Latent Variable Collapse with Generative Skip Models: https://arxiv.org/pdf/1807.04863.pdf
Topic: Generative Adversarial Networks (GANs) for text · SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient: https://www.aaai.org/Conferences/AAAI/2017/PreliminaryPapers/12-Yu-L-14344.pdf · Language GANs Falling Short: https://arxiv.org/abs/1811.02549. Suggested supplementary readings: Evaluating Text GANs as Language Models: https://www.aclweb.org/anthology/N19-1233.pdf
W5
Topic: Insertion-based generation · Insertion Transformer: https://arxiv.org/abs/1902.03249. Suggested supplementary readings: Enabling Language Models to Fill in the Blanks: https://nlp.stanford.edu/pubs/donahue2020infilling.pdf · Non-monotonic Sequential Text Generation: https://arxiv.org/pdf/1902.02192.pdf
Topic: Controlled generation · Toward Controlled Generation of Text: https://arxiv.org/abs/1703.00955 · Posterior Control of Blackbox Generation: https://arxiv.org/pdf/2005.04560.pdf
W6
Topic: Summarization · Pointer-Generator Network: https://arxiv.org/abs/1704.04368 · Text Summarization with BERT: https://arxiv.org/pdf/1908.08345.pdf
Topic: Machine Translation · Dual Learning for Machine Translation: https://papers.nips.cc/paper/6469-dual-learning-for-machine-translation.pdf · BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension: https://arxiv.org/abs/1910.13461
W7
Topic: Dialog systems · Personalizing Dialogue Agents: I have a dog, do you have pets too? https://arxiv.org/abs/1801.07243. Suggested reading: How NOT To Evaluate Your Dialogue System: https://arxiv.org/abs/1603.08023 · Task-Oriented Dialogue as Dataflow Synthesis: https://arxiv.org/abs/2009.11423
Topic: Story generation · Hierarchical Neural Story Generation: https://arxiv.org/abs/1805.04833 · Content Planning for Neural Story Generation with Aristotelian Rescoring: https://arxiv.org/abs/2009.09870. Suggested reading: Plan-and-Write: https://arxiv.org/abs/1811.05701
W8
Topic: Figurative language generation · Sarcasm generation: https://arxiv.org/abs/2004.13248 · Simile generation: https://arxiv.org/abs/2009.08942
Topic: Poetry generation · Chinese Poetry Generation with Recurrent Neural Networks: https://www.aclweb.org/anthology/D14-1074.pdf · Generating Topical Poetry: https://www.aclweb.org/anthology/D16-1126/
Final Presentation (each team has 20 minutes to present its project)
W9
TBA |
TBA |
W10
TBA |
TBA |