CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning
Offered By: Graham Neubig via YouTube
Syllabus
Neural Representation Learning in Natural Language Processing
Neural Representation Learning for NLP
What is a word representation?
Why should we learn word representations?
How can we get word representations?
Symbolic or Distributed? (sketched in code after this syllabus)
Supervised or Unsupervised?
Count-based or Prediction-based?
Case Study: NNLM
Case Study: GloVe
Case Study: ELMo
Case Study: BERT
Software, Model, Corpus
Using non-contextualized representations when ...
Using contextualized representations when ...
What is a sentence representation?
Why do we need sentence representations?
How can we learn sentence representations?
Different Structural Biases
Clusters of Approaches
Case Study: Must-know Points about RNNs
CNN: 1d and 2d Convolution
CNN: Narrow/Equal/Wide Convolution
CNN: Multiple Filter Convolution (see the convolution sketch after this syllabus)
Case Study: Must-know Points about Transformers
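As a quick, concrete illustration of the "Symbolic or Distributed?" question in the syllabus above, here is a minimal Python sketch. It is not taken from the bootcamp materials; the toy vocabulary and embedding size are made up. A symbolic (one-hot) representation is sparse and as wide as the vocabulary, while a distributed representation is a dense, low-dimensional vector (random here, where a real model would learn it).

```python
# Minimal sketch: symbolic (one-hot) vs. distributed (dense embedding)
# word representations. Toy vocabulary and dimensions are illustrative only.
import numpy as np

vocab = ["low", "resource", "nlp", "bootcamp"]
word_to_id = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    """Symbolic representation: a sparse vector with a single 1."""
    v = np.zeros(len(vocab))
    v[word_to_id[word]] = 1.0
    return v

rng = np.random.default_rng(0)
# Distributed representation: dense, low-dimensional; a real model
# (NNLM, GloVe, etc.) would learn this table rather than sample it.
embeddings = rng.normal(size=(len(vocab), 8))

def embed(word):
    return embeddings[word_to_id[word]]

print(one_hot("nlp"))      # [0. 0. 1. 0.]
print(embed("nlp").shape)  # (8,)
```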
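The CNN syllabus items can be made concrete the same way. The following PyTorch sketch is an illustrative assumption, not the lecture's code: it shows how the padding choice yields narrow, equal, and wide 1d convolutions over a sentence of word embeddings, and how multiple filter widths are pooled and concatenated, as in Kim (2014)-style text CNNs.

```python
# Sketch of narrow/equal/wide 1d convolution and multi-filter convolution
# over a sentence of word embeddings. Shapes are illustrative only.
import torch
import torch.nn as nn

batch, emb_dim, sent_len, kernel = 1, 8, 5, 3
x = torch.randn(batch, emb_dim, sent_len)  # (batch, channels, length)

narrow = nn.Conv1d(emb_dim, 4, kernel, padding=0)            # length 5 -> 3
equal  = nn.Conv1d(emb_dim, 4, kernel, padding=kernel // 2)  # length 5 -> 5
wide   = nn.Conv1d(emb_dim, 4, kernel, padding=kernel - 1)   # length 5 -> 7

print(narrow(x).shape)  # torch.Size([1, 4, 3])
print(equal(x).shape)   # torch.Size([1, 4, 5])
print(wide(x).shape)    # torch.Size([1, 4, 7])

# Multiple filter convolution: several kernel widths side by side;
# each output is max-pooled over time and the results are concatenated.
filters = [nn.Conv1d(emb_dim, 4, k) for k in (2, 3, 4)]
pooled = [torch.amax(f(x), dim=2) for f in filters]
features = torch.cat(pooled, dim=1)
print(features.shape)   # torch.Size([1, 12])
```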
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam