CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning

Offered By: Graham Neubig via YouTube

Tags

Natural Language Processing (NLP) Courses, BERT Courses

Course Description

Overview

Explore neural representation learning in natural language processing through this comprehensive lecture from the CMU Low Resource NLP Bootcamp 2020. Delve into various methods for learning neural representations of language, covering topics such as word and sentence representations, supervised and unsupervised learning approaches, and case studies on NNLM, GloVe, ELMo, and BERT. Gain insights into different structural biases, clusters of approaches, and must-know points about RNN, CNN, and Transformer models. Learn when to use non-contextualized versus contextualized representations and understand the importance of software, models, and corpora in neural representation learning for NLP.
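The contrast between non-contextualized and contextualized representations discussed in the lecture can be illustrated with a short sketch. The code below is not part of the course materials; it assumes PyTorch and the Hugging Face transformers library with the bert-base-uncased checkpoint, and the static lookup table stands in for pretrained GloVe vectors.

```python
# Minimal sketch (not from the course): non-contextualized vs. contextualized
# word representations. Assumes torch and transformers are installed; the
# model name and the random "GloVe-style" vectors are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModel

# Non-contextualized: one fixed vector per word type, regardless of context.
static_vectors = {
    "bank": torch.randn(300),   # placeholder for a pretrained GloVe vector
    "river": torch.randn(300),
}
print(static_vectors["bank"][:5])  # identical in every sentence

# Contextualized: the vector for "bank" depends on the surrounding sentence.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bert_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the BERT hidden state at the position of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (inputs["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

v1 = bert_vector("She sat on the bank of the river.", "bank")
v2 = bert_vector("He deposited cash at the bank.", "bank")
# Cosine similarity is below 1: the same word gets different vectors in context.
print(torch.cosine_similarity(v1, v2, dim=0))
```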

Syllabus

Neural Representation Learning in Natural Language Processing
Neural Representation Learning for NLP
What is a word representation?
Why should we learn word representations?
How can we get word representations?
Symbolic or Distributed?
Supervised or Unsupervised?
Count-based or Prediction-based?
Case Study: NNLM
Case Study: GloVe
Case Study: ELMo
Case Study: BERT
Software, Model, Corpus
Using non-contextualized representations when ...
Using contextualized representations when ...
What is a sentence representation?
Why do we need sentence representations?
How can we learn sentence representations?
Different Structural Biases
Clusters of Approaches
Case Study: Must-know Points about RNNs
CNN: 1d and 2d Convolution
CNN: Narrow/Equal/Wide Convolution
CNN: Multiple Filter Convolution
Case Study: Must-know Points about Transformers


Taught by

Graham Neubig

Related Courses

Natural Language Processing
Columbia University via Coursera
Natural Language Processing
Stanford University via Coursera
Introduction to Natural Language Processing
University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies)
Universidad de Alicante via Miríadax
Natural Language Processing
Indian Institute of Technology, Kharagpur via Swayam