CMU Multilingual NLP 2020 - Advanced Text Classification/Labeling
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Text Classification
Sequence Labeling: given an input text X, predict an output label sequence of equal length (a minimal tagger sketch appears after the syllabus)
Reminder: Bi-RNNs - a simple and standard model for sequence labeling or classification
Issues w/ Simple BiRNN
Alternative: Bag of n-grams
Unknown Words
Sub-word Segmentation
Unsupervised Subword Segmentation Algorithms (a toy BPE sketch follows the syllabus)
Sub-word Based Embeddings
Sub-word Based Embedding Models
Embeddings for Cross-lingual Learning: Soft Decoupled Encoding
Labeled/Unlabeled Data: the problem is that we have very little labeled data for most analysis tasks in most languages
Joint Multi-task Learning
Pre-training
Masked Language Modeling (illustrated after the syllabus)
Thinking about Multi-tasking, and Pre-trained Representations
Other Monolingual BERTs
XTREME: Comparing Multilingual Representations
Why Call it "Structured" Prediction?
Why Model Interactions in Output?
Local Normalization vs. Global Normalization (contrasted in the equations after the syllabus)
Potential Functions
Discussion
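The sequence-labeling items above map naturally onto a small bidirectional RNN tagger. The sketch below is a minimal PyTorch illustration, not the lecture's own code; the class name `BiLSTMTagger` and all sizes (`vocab_size`, `num_tags`, dimensions) are assumptions made for the example.

```python
# Minimal Bi-LSTM sequence labeler, in the spirit of the "Reminder: Bi-RNNs"
# syllabus item. All names and sizes are illustrative, not from the lecture.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # A bidirectional LSTM reads the sentence left-to-right and
        # right-to-left, so each token's state sees the whole input.
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        # One label per token: project each hidden state to tag scores.
        self.out = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):            # (batch, seq_len)
        h, _ = self.rnn(self.embed(token_ids))
        return self.out(h)                   # (batch, seq_len, num_tags)

tagger = BiLSTMTagger(vocab_size=10000, num_tags=9)
scores = tagger(torch.randint(0, 10000, (2, 7)))  # 2 sentences, 7 tokens each
print(scores.shape)  # torch.Size([2, 7, 9]) -- one score vector per token
```

Because each position gets an independent softmax over tags, this is a locally normalized model; the structured-prediction items later in the syllabus motivate modeling interactions between output labels instead.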
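For the unsupervised subword segmentation item, byte-pair encoding (BPE) is one standard algorithm in this family: start from characters and repeatedly merge the most frequent adjacent symbol pair. The toy sketch below assumes a plain list of words as the corpus; `learn_bpe` and its arguments are illustrative, not the lecture's code.

```python
# Toy byte-pair-encoding (BPE) learner, one common unsupervised subword
# segmentation algorithm. A sketch for illustration only.
from collections import Counter

def learn_bpe(words, num_merges):
    # Represent each word as a tuple of symbols, starting from characters.
    vocab = Counter(tuple(w) for w in words)
    merges = []
    for _ in range(num_merges):
        # Count how often each adjacent symbol pair occurs across the corpus.
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)   # most frequent adjacent pair
        merges.append(best)
        # Apply the merge everywhere, fusing the pair into one symbol.
        new_vocab = Counter()
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

print(learn_bpe(["low", "lower", "lowest", "newest"], num_merges=3))
```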
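Masked language modeling pre-trains an encoder by hiding a random subset of input tokens and training the model to predict them from context. Below is a minimal sketch of just the masking step, assuming PyTorch conventions; `mask_tokens`, the 15% rate, and the -100 ignore label are illustrative choices (-100 matches the default `ignore_index` of `nn.CrossEntropyLoss`).

```python
# Sketch of the masked-LM objective's data preparation: corrupt the input,
# keep the original ids only at masked positions as prediction targets.
import torch

def mask_tokens(token_ids, mask_id, mask_prob=0.15):
    ids = token_ids.clone()
    mask = torch.rand(ids.shape) < mask_prob       # choose positions to hide
    ids[mask] = mask_id                            # replace with [MASK] id
    # Targets: original id where masked, -100 (ignored by the loss) elsewhere.
    labels = torch.where(mask, token_ids, torch.full_like(token_ids, -100))
    return ids, labels

tokens = torch.tensor([[5, 8, 2, 9, 4]])
inputs, labels = mask_tokens(tokens, mask_id=0)
# Feed `inputs` to the encoder; compute cross-entropy against `labels`.
```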
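One way to write the contrast behind the "Local Normalization vs. Global Normalization" and "Potential Functions" items, with notation assumed here rather than taken from the slides: a locally normalized model normalizes at each step, while a globally normalized (CRF-style) model has a single partition function over all label sequences, built from potential functions.

```latex
% Locally normalized: each step's distribution sums to one on its own.
P(Y \mid X) = \prod_{t=1}^{T}
  \frac{\exp s(y_t \mid y_{<t}, X)}{\sum_{y'} \exp s(y' \mid y_{<t}, X)}

% Globally normalized (CRF-style): one partition function over all label
% sequences Y', built from pairwise potential functions \psi.
P(Y \mid X) =
  \frac{\exp \sum_{t=1}^{T} \psi(y_{t-1}, y_t, X)}
       {\sum_{Y'} \exp \sum_{t=1}^{T} \psi(y'_{t-1}, y'_t, X)}
```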
Taught by
Graham Neubig
Related Courses
Structuring Machine Learning Projects
DeepLearning.AI via Coursera
Structuring Machine Learning Projects (Russian version)
DeepLearning.AI via Coursera
Structuring Machine Learning Projects (Korean version)
DeepLearning.AI via Coursera
Stanford CS330: Deep Multi-Task and Meta Learning
Stanford University via YouTube
Stanford Seminar - The Next Generation of Robot Learning
Stanford University via YouTube