Instructor: Seth Pate, first and only TA: Liam Pavlovic
Monday & Wednesday, Dec 5 & 7 - Presentations
Held in our normal classroom.
Thursday, Dec 1 - Robot Talent Show
Held in ISEC.
Wednesday, Nov 30 - Fairness, Bias, Toxicity
culture lab due
Monday, Nov 28 - Robot Lab VIII
Last day for help with your robots!
Wednesday & Thursday, Nov 23 & 24 - Thanksgiving Break
Happy Thanksgiving!
Monday, Nov 21 - Question Answering
References:
- SQuAD - question answering dataset
- BiDAF - LSTM-based Q/A system
- Reading Wikipedia to Answer Open-Domain Questions
- Latent Retrieval for Weakly Supervised Open-Domain Question Answering
- Dense Passage Retrieval for Open-Domain Question Answering
- Learning Dense Representations of Phrases at Scale
Thursday, Nov 17 - Open Lab
Class is optional; the TA will be in the lab to answer questions about projects.
Wednesday, Nov 16 - Research Lab Day
Held in ISEC.
Monday, Nov 14 - Robot Lab VII
Held in ISEC.
Thursday, Nov 10 - Robot Lab VI
Class will be held in ISEC.
Wednesday, Nov 9 - Mathematical Concepts in NLP
Guest lecture by Liam Pavlovic.
Monday, Nov 7 - Midterm Research Presentations
Thursday, Nov 3 - Robot Lab V
Class will be held in ISEC.
Wednesday, Nov 2 - Pretraining and Transformers in the Field
Monday, Oct 31 - Transformers and Pretraining
We’ll finish our material from the Wednesday, Oct 26 lecture, then move on to pretraining.
Thursday, Oct 27 - Robot Lab IV
Class will be held in ISEC.
Wednesday, Oct 26 - Transformers
Recommended:
- Attention is All You Need - original transformer paper
- The Illustrated Transformer - helpful diagrams and animations
- Layer Normalization - a key part of transformer models
- Image Transformer - using transformers for stuff other than text
Monday, Oct 24 - Attention and Pytorch
- cs224 - notes
- cs224 - slides
- cs224 - lecture: The first half hour or so of the lecture is a good description of attention (a minimal sketch appears at the end of this entry). The rest of the lecture is also full of good advice for research projects, most of which applies to your own projects.
If we have time, we’ll start Transformers.
lab 4 (nmt) due
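Not from the course materials, but as a quick illustration of what the attention portion of that lecture covers, here is a minimal sketch of scaled dot-product attention in PyTorch (the tensor names and toy shapes are just illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(queries, keys, values):
    # queries: (batch, n_queries, d); keys, values: (batch, n_keys, d)
    d = queries.size(-1)
    # similarity of each query with each key, scaled by sqrt(d)
    scores = queries @ keys.transpose(-2, -1) / d ** 0.5
    weights = F.softmax(scores, dim=-1)   # attention distribution over the keys
    return weights @ values               # weighted sum of the values

# toy usage: 2 queries attending over 5 key/value pairs
q, k, v = torch.randn(1, 2, 8), torch.randn(1, 5, 8), torch.randn(1, 5, 8)
out = scaled_dot_product_attention(q, k, v)   # shape (1, 2, 8)
```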
Thursday, Oct 20 - Robot Lab III
Class will be held in ISEC.
Wednesday, Oct 19 - Seq2Seq and Attention
Monday, Oct 17 - Simple and LSTM RNNs
These notes cover both this unit and the previous (language models, RNNs). Try to cover the material presented in the slides below.
Thursday, Oct 13 - Robot Lab II
Class will be held in ISEC.
Wednesday, Oct 12 - Recurrent NNs and Language Models
These notes cover both this unit and the next (vanishing gradients, seq2seq). Try to cover the material presented in the slides below.
Recommended, not required:
- n-gram language models – This is an older method that may help you understand how language modeling works in general (a tiny worked example follows this list).
- the unreasonable effectiveness of RNNs – another hit from Andrej, describing character level RNNs and how good they are (especially for the time, 2015).
- sequence modeling – another way of understanding this material, which you may find helpful.
- on chomsky and the two cultures of statistical learning – a bit of inside baseball here, but Chomsky is a big figure in NLP, and you may want to understand some context.
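Purely as an illustration of the optional n-gram reading above (the toy corpus and numbers are made up, not course material), here is a tiny bigram language model that estimates P(word | previous word) by counting:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# count how often each word follows each previous word
bigram_counts = defaultdict(Counter)
for prev, curr in zip(corpus, corpus[1:]):
    bigram_counts[prev][curr] += 1

def bigram_prob(prev, curr):
    # maximum-likelihood estimate: count(prev, curr) / count(prev, anything)
    total = sum(bigram_counts[prev].values())
    return bigram_counts[prev][curr] / total if total else 0.0

print(bigram_prob("the", "cat"))  # 0.25 ("the" is followed by cat, mat, dog, rug)
print(bigram_prob("sat", "on"))   # 1.0  ("sat" is always followed by "on")
```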
Thursday, Oct 6 - Basic Robotics
Guest lecture from Lawson Wong, in our normal room.
Wednesday, Oct 5 - Robot Lab
Please meet in the 5th floor lab in ISEC, a little bit early if you can. Review the robot lab ahead of time.
Monday, Oct 3 - Research Proposal, Backprop, torch.nn
Please review these:
Thu, Sep 29 - Neural Networks
Recommended:
- matrix calculus notes
- Review of differential calculus
- cs231n notes on network architecture
- cs231n notes on backprop
- Derivatives, Backpropagation, and Vectorization
- Learning Representations by Backpropagating Errors
- Yes you should understand backprop
- NLP (Almost) from Scratch
Wed, Sep 28 - Gary Marcus Lecture
Starting at 1pm.
Mon, Sep 26 - Word Vectors and Neural Classifiers
- cs224 - notes
- cs224 - slides
- cs224 - lecture
- recommended: GloVe: Global Vectors for Word Representation
- recommended: Improving Distributional Similarity with Lessons Learned from Word Embeddings
- recommended: Evaluation methods for unsupervised word embeddings
Thu, Sep 22 - Word Vectors (cont)
See readings for Wed, Sep 21.
Wed, Sep 21 - Word Vectors
- cs224 - notes
- cs224 - slides
- cs224 - lecture
- recommended: Efficient Estimation of Word Representations in Vector Space
- recommended: Distributed Representations of Words and Phrases and their Compositionality
- assigned: lab 2 - word vectors
Mon, Sep 19 - Linear Algebra, Matrix Calculus
- cs229 - notes on linear algebra: Please read section 4, although you are welcome to skim the discussion of the Hessian. Sections 4.5 and 4.6 are not particularly relevant for us.
Please read one of the following papers; as with last time, take a maximum of thirty minutes.
- “A Comprehensive Survey of Deep Learning for Image Captioning”, Hossain et al (2018)
- “Listen, Attend, and Walk”, Mei et al (2015)
- “ALFRED: A Benchmark for Interpreting Grounded Instructions”, Shridhar et al (2020)
Thu, Sep 15 - Reading Papers, Linear Algebra
- cs229 - notes on linear algebra: Please read sections 1, 2, and part of 3, through ‘norms’ but not after. Can you relate this material to your practice in numpy?
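One way to make that numpy connection concrete (just a sketch, not part of the assigned notes): the vector and matrix norms from the cs229 reading, written with numpy.

```python
import numpy as np

x = np.array([3.0, -4.0])

# common vector norms
l2 = np.linalg.norm(x)            # Euclidean norm: sqrt(9 + 16) = 5.0
l1 = np.linalg.norm(x, 1)         # sum of absolute values = 7.0
linf = np.linalg.norm(x, np.inf)  # largest absolute entry = 4.0

A = np.array([[1.0, 2.0], [3.0, 4.0]])
fro = np.linalg.norm(A)           # Frobenius norm: sqrt(1 + 4 + 9 + 16)
```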
Please read one of the following papers. Take a maximum of thirty minutes. I’d like you to ask me a question about it in class!
- “Visual Question Answering”, Agrawal et al. (2016)
- “Walk the Talk”, MacMahon et al. (2006)
- “Experience Grounds Language”, Bisk et al. (2020)
Wed, Sep 14 - Robotics Research Survey, Linear Algebra
- Please read Tellex et al’s ‘Robots That Use Language’. Skimming is fine, but take note of what tasks interest you!
Mon, Sep 12 - Choosing a Research Topic
- Please review the updated research project page.
- my notes on choosing a research project
Thu, Sep 8 - Setting Up a Workstation
- please read the lab ahead of time, and bring a laptop
- lab 1 - workstation
Wed, Sep 7 - Course Introduction, State of NLP
- review the syllabus