CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning
Offered By: Graham Neubig via YouTube
Syllabus
Neural Representation Learning in Natural Language Processing
Neural Representation Learning for NLP
What is a word representation?
Why should we learn word representations?
How can we get word representations?
Symbolic or Distributed?
Supervised or Unsupervised?
Count-based or Prediction-based? (illustrated in the first sketch below the syllabus)
Case Study: NNLM
Case Study: GloVe
Case Study: ELMo
Case Study: BERT
Software, Model, Corpus
Using non-contextualized when ...
Using contextualized when ... (illustrated in the second sketch below the syllabus)
What is a sentence representation?
Why do we need sentence representations?
How can we learn sentence representations?
Different Structural Biases
Clusters of Approaches
Case Study: Must-know Points about RNNs
CNN: 1d and 2d Convolution
CNN: Narrow/Equal/Wide Convolution
CNN: Multiple Filter Convolution (illustrated in the third sketch below the syllabus)
Case Study: Must-know Points about Transformers
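The syllabus contrasts count-based and prediction-based word representations. As a companion, here is a minimal sketch, not taken from the lecture and using a hypothetical two-sentence corpus, of the count-based recipe: tally a co-occurrence matrix and factorize it with SVD to get dense word vectors. GloVe refines this idea with a weighted least-squares objective, while prediction-based methods such as the NNLM instead learn vectors by predicting neighboring words.

```python
# Count-based word vectors: co-occurrence counts + SVD.
# Illustrative sketch only; corpus and dimensionality are made up.
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]

# Vocabulary and a symmetric, window-1 co-occurrence matrix.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1.0

# Truncating the SVD of the count matrix yields dense word vectors.
U, S, _ = np.linalg.svd(C)
dim = 2  # illustrative dimensionality
vectors = U[:, :dim] * S[:dim]
print(vectors[idx["cat"]], vectors[idx["dog"]])  # distributionally similar words
```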
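The "using non-contextualized when ... / using contextualized when ..." items turn on one property: a static table assigns each word a single vector, while a contextual encoder assigns a vector per occurrence. The sketch below (assuming the Hugging Face transformers package, which is not named by the course) shows BERT producing different vectors for "bank" in two senses.

```python
# Contextualized representations: the same word gets different vectors
# in different contexts. Assumes the `transformers` package is installed.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sents = ["I deposited cash at the bank.",
         "We sat on the bank of the river."]
bank_id = tok.convert_tokens_to_ids("bank")

vecs = []
with torch.no_grad():
    for s in sents:
        enc = tok(s, return_tensors="pt")
        hidden = model(**enc).last_hidden_state[0]      # (tokens, 768)
        pos = enc.input_ids[0].tolist().index(bank_id)  # locate "bank"
        vecs.append(hidden[pos])

# A static embedding table would give cosine similarity 1.0 here.
print(torch.cosine_similarity(vecs[0], vecs[1], dim=0))
```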
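The CNN case-study items reduce to padding arithmetic: narrow, equal, and wide 1-d convolutions shrink, preserve, or grow the sequence length, and multiple-filter models run several kernel widths in parallel. A minimal PyTorch sketch, not course code and with all sizes chosen for illustration:

```python
# Narrow/equal/wide 1-d convolution differ only in padding; multiple-filter
# convolution concatenates features from several kernel widths.
import torch
import torch.nn as nn

emb_dim, seq_len, k = 8, 10, 3        # illustrative sizes
x = torch.randn(1, emb_dim, seq_len)  # one sentence: (batch, channels, length)

narrow = nn.Conv1d(emb_dim, 4, k, padding=0)       # length seq_len - k + 1 = 8
equal  = nn.Conv1d(emb_dim, 4, k, padding=k // 2)  # length seq_len = 10 (odd k)
wide   = nn.Conv1d(emb_dim, 4, k, padding=k - 1)   # length seq_len + k - 1 = 12
print(narrow(x).shape, equal(x).shape, wide(x).shape)

# Multiple filters: convolve with widths 2, 3, 4, max-pool over time,
# and concatenate into one fixed-size sentence feature vector.
feats = [torch.relu(nn.Conv1d(emb_dim, 4, w)(x)).max(dim=2).values
         for w in (2, 3, 4)]
print(torch.cat(feats, dim=1).shape)  # (1, 12)
```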
Taught by
Graham Neubig
Related Courses
Sentiment Analysis with Deep Learning using BERT (Coursera Project Network via Coursera)
Natural Language Processing with Attention Models (DeepLearning.AI via Coursera)
Fine Tune BERT for Text Classification with TensorFlow (Coursera Project Network via Coursera)
Deploy a BERT question answering bot on Django (Coursera Project Network via Coursera)
Generating discrete sequences: language and music (Ural Federal University via edX)