CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Reminder: Types of Learning
Standard Multi-task Learning
Selective Parameter Adaptation - sometimes it is better to adapt only some of the parameters (see the first sketch after the syllabus)
Different Layers for Different Tasks (Hashimoto et al. 2017)
Multiple Annotation Standards
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multi-lingual Sequence-to-sequence Models
Multi-lingual Pre-training
Difficulties in Fully Multi-lingual Learning
Data Balancing
Cross-lingual Transfer Learning
What if languages don't share the same script?
Zero-shot Transfer to New Languages
Data Creation, Active Learning - in order to get in-language training data, Active Learning (AL) can be used (see the second sketch after the syllabus)
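The selective parameter adaptation point boils down to freezing the shared pre-trained parameters and fine-tuning only a task-specific subset. Below is a minimal PyTorch sketch of that idea; the TaggerModel and its layer names are hypothetical, invented purely for illustration and not taken from the lecture.

```python
import torch
import torch.nn as nn

# Hypothetical two-part model: a shared encoder plus a small
# task-specific output head.
class TaggerModel(nn.Module):
    def __init__(self, vocab_size=10000, hidden=256, num_tags=20):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_tags)  # task-specific layer

    def forward(self, x):
        h, _ = self.encoder(self.embed(x))
        return self.head(h)

model = TaggerModel()

# Selective adaptation: freeze the shared parameters so only the
# task-specific head is updated during fine-tuning.
for p in model.embed.parameters():
    p.requires_grad = False
for p in model.encoder.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3)
```

Passing only the still-trainable parameters to the optimizer guarantees the frozen encoder is left untouched while the head adapts to the new task.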
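For the active-learning point, one common selection strategy (not necessarily the one used in the lecture) is uncertainty sampling: rank unlabeled sentences by the model's average token-level entropy and send the most uncertain ones to annotators. A minimal sketch, where predict_probs is a stand-in for a real model's per-token label distributions:

```python
import math

def token_entropy(prob_dist):
    # Entropy of one token's predicted label distribution.
    return -sum(p * math.log(p) for p in prob_dist if p > 0)

def select_for_annotation(unlabeled, predict_probs, k=2):
    """Uncertainty sampling: rank sentences by mean token entropy
    and return the k most uncertain ones for human annotation."""
    scored = []
    for sent in unlabeled:
        dists = predict_probs(sent)
        score = sum(token_entropy(d) for d in dists) / len(dists)
        scored.append((score, sent))
    scored.sort(reverse=True)
    return [sent for _, sent in scored[:k]]

# Toy stand-in scorer: confident on short sentences, less
# confident on long ones (a real model would supply these).
def toy_probs(sent):
    n = len(sent.split())
    conf = max(0.5, 1.0 - 0.05 * n)
    rest = (1.0 - conf) / 2
    return [[conf, rest, rest]] * n

pool = ["short sentence",
        "a much longer and rarer sentence to label",
        "medium length sentence here"]
print(select_for_annotation(pool, toy_probs, k=2))
```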
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam