Schedule and Syllabus

Unless otherwise specified, lectures meet at the following time and location:

Time: Tuesday and Thursday, 4:30-5:50 PM
Location: NVIDIA Auditorium
Event | Date | Description | Course Materials
Lecture Jan 10 Intro to NLP and Deep Learning Suggested Readings:
  1. [Linear Algebra Review]
  2. [Probability Review]
  3. [Convex Optimization Review]
  4. [More Optimization (SGD) Review]
[python tutorial]
[slides]
[Lecture Notes 1]
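Illustrative sketch (not from the course materials): the plain SGD update covered in the optimization review above, on a toy one-variable objective; the learning rate and step count are arbitrary.
```python
# Stochastic gradient descent in miniature: repeatedly step against the
# gradient. Here we minimize f(w) = (w - 3)^2, whose minimum is w = 3.
w = 0.0
for _ in range(100):
    grad = 2 * (w - 3)   # f'(w)
    w -= 0.1 * grad      # learning rate 0.1 (illustrative)
print(w)                 # converges to ~3.0
```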
Lecture Jan 12 Simple Word Vector Representations Suggested Readings:
  1. [Word2Vec Tutorial - The Skip-Gram Model]
  2. [Distributed Representations of Words and Phrases and their Compositionality]
  3. [Efficient Estimation of Word Representations in Vector Space]
[slides]
Spotlight: [slides] [paper]
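Illustrative sketch (not from the course materials): one skip-gram training step with the naive softmax described in the readings above; vocabulary size, dimensions, and learning rate are toy values, and real word2vec uses negative sampling over a large corpus.
```python
# One skip-gram SGD step on -log p(context | center) with full softmax.
import numpy as np

rng = np.random.default_rng(0)
V, d = 10, 4                                 # vocabulary size, embedding dim
W_in = rng.normal(scale=0.1, size=(V, d))    # center-word vectors
W_out = rng.normal(scale=0.1, size=(V, d))   # context-word vectors

def skipgram_step(center, context, lr=0.1):
    v = W_in[center]                   # center vector, shape (d,)
    scores = W_out @ v                 # (V,)
    p = np.exp(scores - scores.max())
    p /= p.sum()                       # softmax probabilities
    loss = -np.log(p[context])
    dscores = p.copy()
    dscores[context] -= 1.0            # gradient of cross-entropy w.r.t. scores
    W_in[center] -= lr * (W_out.T @ dscores)
    W_out[:] -= lr * np.outer(dscores, v)
    return loss

for _ in range(5):                     # loss falls on a repeated toy pair
    print(skipgram_step(center=3, context=7))
```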
A1 released Jan 12 Assignment #1 released [Assignment 1] [Written solution]
Lecture Jan 17 Advanced Word Vector Representations Suggested Readings:
  1. [GloVe: Global Vectors for Word Representation]
  2. [Improving Distributional Similarity with Lessons Learned from Word Embeddings]
  3. [Evaluation methods for unsupervised word embeddings]
[slides]
[Lecture Notes 2]
Spotlight: [slides] [paper]
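Illustrative sketch (not from the course materials): cosine similarity and the vector-offset analogy test used for intrinsic evaluation in the readings above; the hand-made 2-D vectors stand in for trained GloVe or word2vec embeddings.
```python
import numpy as np

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

vecs = {"king":  np.array([0.8, 0.3]), "queen": np.array([0.7, 0.4]),
        "man":   np.array([0.6, 0.1]), "woman": np.array([0.5, 0.2])}

# "man : woman :: king : ?" via the offset king - man + woman.
target = vecs["king"] - vecs["man"] + vecs["woman"]
best = max((w for w in vecs if w != "king"),
           key=lambda w: cosine(vecs[w], target))
print(best)   # "queen" on these toy vectors
```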
Lecture Jan 19 Word Window Classification and Neural Networks Suggested Readings:
  1. cs231n notes on [backprop] and [network architectures]
  2. [Review of differential calculus]
  3. [Natural Language Processing (almost) from Scratch]
  4. [Learning Representations by Back-Propagating Errors]
[slides]
[Lecture Notes 3]
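Illustrative sketch (not from the course materials): the forward and backward pass of a one-hidden-layer window classifier, with the chain rule written out layer by layer as in the backprop notes above; shapes, weights, and input are toy values.
```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=20)            # concatenated window of word vectors
y = 1                              # gold class index
W1, b1 = rng.normal(scale=0.1, size=(8, 20)), np.zeros(8)
W2, b2 = rng.normal(scale=0.1, size=(3, 8)), np.zeros(3)

# Forward pass.
h = np.tanh(W1 @ x + b1)
scores = W2 @ h + b2
p = np.exp(scores - scores.max())
p /= p.sum()
loss = -np.log(p[y])

# Backward pass (chain rule, layer by layer).
dscores = p.copy(); dscores[y] -= 1   # dLoss/dscores
dW2, db2 = np.outer(dscores, h), dscores
dh = W2.T @ dscores
dz = dh * (1 - h**2)                  # tanh'(z) = 1 - tanh(z)^2
dW1, db1 = np.outer(dz, x), dz
print(loss, np.linalg.norm(dW1))
```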
Lecture Jan 24 Project Advice, Neural Net Details and Practical Tips Suggested Readings:
  1. [Vector, Matrix, and Tensor Derivatives]
  2. Section 4 of [A Primer on Neural Network Models for Natural Language Processing]
[slides]
Spotlight: [slides] [paper]
Lecture Jan 26 Dependency Parsing Suggested Readings:
  1. Joakim Nivre. 2004. Incrementality in Deterministic Dependency Parsing. Workshop on Incremental Parsing.
  2. Danqi Chen and Christopher D. Manning. 2014. A Fast and Accurate Dependency Parser using Neural Networks. EMNLP 2014.
  3. Sandra Kübler, Ryan McDonald, Joakim Nivre. 2009. Dependency Parsing. Morgan and Claypool. [Free access from Stanford campus only!]
  4. Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, and Michael Collins. 2016. Globally Normalized Transition-Based Neural Networks. ACL 2016.
  5. Marie-Catherine de Marneffe, Timothy Dozat, Natalia Silveira, Katri Haverinen, Filip Ginter, Joakim Nivre, and Christopher D. Manning. 2014. Universal Stanford Dependencies: A cross-linguistic typology. Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC-2014). Revised version for UD v1.
  6. Universal Dependencies website
[slides]
[Lecture Notes 4]
Spotlight: [slides] [paper]
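Illustrative sketch (not from the course materials): the arc-standard transition system from Nivre (2004), which the Chen & Manning parser drives with a learned neural classifier; here a hard-coded action sequence stands in for the model.
```python
# SHIFT moves a word from the buffer to the stack; LEFT/RIGHT attach
# the second-top or top of the stack as a dependent and pop it.
def parse(words, oracle):
    stack, buffer, arcs = ["ROOT"], list(words), []
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT":               # second-top <- top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))    # (head, dependent)
        else:                                # "RIGHT": second-top -> top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Hard-coded oracle for "She ate fish": She <- ate -> fish, ROOT -> ate.
actions = iter(["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT", "RIGHT"])
print(parse(["She", "ate", "fish"], lambda s, b: next(actions)))
```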
A1 Due Jan 26 Assignment #1 due
A2 Released Jan 26 Assignment #2 released [Assignment 2] [Written solution]
Lecture Jan 31 Introduction to TensorFlow Suggested Readings:
  1. [TensorFlow Basic Usage]
[slides]
[Lecture Notes Tensorflow]
Spotlight: [slides] [paper]
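Illustrative sketch (not from the course materials): TensorFlow basic usage in the graph-and-session style of the reading above, assuming the TF 1.x API that was current for this offering; shapes and values are toy.
```python
# Build a static computation graph, then execute it in a session.
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 3])   # input minibatch
W = tf.Variable(tf.random_normal([3, 1]))
b = tf.Variable(tf.zeros([1]))
y = tf.matmul(x, W) + b                            # a linear layer

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```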
Lecture Feb 2 Recurrent Neural Networks and Language Modeling [slides]
[vanishing grad example] [vanishing grad notebook]
Spotlight: [slides] [paper]
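Illustrative sketch (not from the course materials, and simpler than the notebook above): backprop through time multiplies the gradient by the recurrent Jacobian at every step, so its norm shrinks geometrically when that Jacobian's spectral radius is below 1; the weight scale here is chosen to make it so.
```python
import numpy as np

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(10, 10))   # recurrent weights, spectral radius < 1
grad = rng.normal(size=10)            # dLoss/dh at the final time step
for t in range(1, 51):
    grad = W.T @ grad                 # one step of backprop through time
    if t % 10 == 0:
        print(t, np.linalg.norm(grad))   # norm decays geometrically
```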
Lecture Feb 7 RNNs/LSTMs/GRUs [slides]
[Lecture Notes 5]
Spotlight: [slides] [paper 1] [paper 2] [paper 3]
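Illustrative sketch (not from the course materials): one GRU step, using one common gating convention; the update gate interpolates between the old state and a candidate state, which is what lets gradients survive long sequences. Weights are random stand-ins for learned parameters.
```python
import numpy as np

rng = np.random.default_rng(0)
d = 6
Wz, Wr, Wh = (rng.normal(scale=0.3, size=(d, 2 * d)) for _ in range(3))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(h, x):
    z = sigmoid(Wz @ np.concatenate([h, x]))            # update gate
    r = sigmoid(Wr @ np.concatenate([h, x]))            # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([r * h, x]))  # candidate state
    return (1 - z) * h + z * h_tilde                    # interpolate old/new

h = np.zeros(d)
for x in rng.normal(size=(5, d)):    # run over a length-5 input sequence
    h = gru_step(h, x)
print(h)
```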
Review Feb 9 Midterm Review [slides]
Project Proposal Due Feb 9 Final project proposal due [Project page]
A2 Due Feb 9 Assignment #2 due
A3 Released Feb 13 Assignment #3 released [Assignment 3]
Midterm Feb 14 In-class midterm [Gradient Computation Notes]
Practice midterms: [Midterm 1] [Midterm 2] [Midterm 1 Solutions] [Midterm 2 Solutions]
Lecture Feb 16 Neural Machine Translation: Encoder-Decoder RNNs and Attention Mechanisms Suggested Readings:
  1. [Sequence to Sequence Learning with Neural Networks]
  2. [Neural Machine Translation by Jointly Learning to Align and Translate]
  3. [Effective Approaches to Attention-based Neural Machine Translation]
[slides]
Spotlight: [slides] [paper]
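Illustrative sketch (not from the course materials): dot-product attention in the style of Luong et al. (reading 3): score each encoder state against the current decoder state, softmax over source positions, and take the weighted average as the context vector. Dimensions and vectors are toy.
```python
import numpy as np

def attention(decoder_state, encoder_states):
    """decoder_state: (d,); encoder_states: (T, d) -> context (d,), weights (T,)."""
    scores = encoder_states @ decoder_state   # dot-product scores, (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over source positions
    context = weights @ encoder_states        # weighted average, (d,)
    return context, weights

rng = np.random.default_rng(0)
ctx, w = attention(rng.normal(size=8), rng.normal(size=(5, 8)))
print(w)   # how much the decoder attends to each source position
```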
Lecture Feb 21 Gating in RNNs Revisited / MT Evaluation / Recent Improvements to NMT Suggested Readings:
  1. [On Using Very Large Target Vocabulary for Neural Machine Translation]
  2. [Pointing the Unknown Words]
  3. [Neural Machine Translation of Rare Words with Subword Units]
  4. [Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models]
[slides]
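Illustrative sketch (not from the course materials): the byte-pair-encoding merge loop of Sennrich et al. (reading 3) in miniature; each step merges the most frequent adjacent symbol pair, so rare words decompose into learned subword units. Word counts are toy.
```python
from collections import Counter

words = {("l","o","w"): 5, ("l","o","w","e","r"): 2, ("n","e","w","e","s","t"): 6}

def merge_step(vocab):
    pairs = Counter()
    for word, freq in vocab.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    (a, b), _ = pairs.most_common(1)[0]          # most frequent adjacent pair
    merged = {}
    for word, freq in vocab.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == (a, b):
                out.append(a + b); i += 2        # apply the merge
            else:
                out.append(word[i]); i += 1
        merged[tuple(out)] = freq
    return merged, (a, b)

for _ in range(3):
    words, pair = merge_step(words)
    print("merged", pair)
```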
Lecture Feb 23 Speech Processing
A3 Due Feb 25 Assignment #3 due
A4 Released Feb 25 Assignment #4 (default final project) released [Assignment 4]
Lecture Feb 28 Convolutional Neural Networks Suggested Readings:
  1. [A Convolutional Neural Network for Modelling Sentences]
  2. [Convolutional Neural Networks for Sentence Classification]
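Illustrative sketch (not from the course materials): the core operation of Kim (2014, reading 2): convolve one filter over word-vector trigrams and take the max over time; a real model uses many filters plus a softmax classifier, and the embeddings here are random toys.
```python
import numpy as np

rng = np.random.default_rng(0)
sentence = rng.normal(size=(7, 5))    # 7 words, 5-dim embeddings
filt = rng.normal(size=(3, 5))        # one filter over trigrams
bias = 0.0

responses = [np.tanh(np.sum(sentence[i:i + 3] * filt) + bias)
             for i in range(len(sentence) - 2)]   # slide over positions
feature = max(responses)              # max-over-time pooling -> one feature
print(feature)
```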
Lecture Mar 2 Tree Recursive Neural Networks and Constituency Parsing Suggested Readings:
  1. [Parsing with Compositional Vector Grammars]
  2. [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank]
  3. [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks]
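Illustrative sketch (not from the course materials): the bottom-up composition at the heart of the recursive models in the readings above; real models learn the composition matrix and attach a sentiment softmax at each node, whereas the weights and leaf vectors here are random toys.
```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.normal(scale=0.3, size=(d, 2 * d))   # composition matrix
leaves = {"the": rng.normal(size=d), "movie": rng.normal(size=d),
          "was": rng.normal(size=d), "great": rng.normal(size=d)}

def compose(tree):
    """tree: a word, or a (left, right) pair of subtrees."""
    if isinstance(tree, str):
        return leaves[tree]
    left, right = map(compose, tree)
    return np.tanh(W @ np.concatenate([left, right]))

# ((the movie) (was great)) -> one vector for the whole sentence
print(compose((("the", "movie"), ("was", "great"))))
```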
Lecture Mar 7 Coreference Resolution
Lecture Mar 9 Natural Language Understanding
Lecture Mar 14 Question Answering
Lecture Mar 16 Dynamic Memory Networks and Other New Architectures
Final Project Due Mar 17 Final course project / Assignment #4 due
Poster Presentation Mar 21 Final project poster presentations 12:15-3:15, location TBD