CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning
Offered By: Graham Neubig via YouTube
Course Description
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Reminder: Types of Learning
Standard Multi-task Learning
Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters (see the first code sketch after this list)
Different Layers for Different Tasks (Hashimoto et al. 2017)
Multiple Annotation Standards
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multi-lingual Sequence-to-sequence Models
Multi-lingual Pre-training
Difficulties in Fully Multi-lingual Learning
Data Balancing
Cross-lingual Transfer Learning
What if languages don't share the same script?
Zero-shot Transfer to New Languages
Data Creation, Active Learning: in order to get in-language training data, Active Learning (AL) can be used (see the second code sketch below)
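As a concrete illustration of the selective parameter adaptation item above, here is a minimal PyTorch-style sketch that freezes the embeddings and lower encoder layers of a pretrained model and fine-tunes only the top layer plus a new task head. The model class, layer names, and sizes are illustrative assumptions, not code from the lecture.

```python
import torch
import torch.nn as nn

# Hypothetical pretrained encoder; names and sizes are assumptions for illustration.
class Encoder(nn.Module):
    def __init__(self, vocab_size=10000, dim=256, num_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.layers = nn.ModuleList(
            [nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
             for _ in range(num_layers)]
        )

    def forward(self, x):
        h = self.embed(x)
        for layer in self.layers:
            h = layer(h)
        return h

encoder = Encoder()            # imagine this was loaded from a pretrained checkpoint
task_head = nn.Linear(256, 2)  # new task-specific classifier

# Selective adaptation: freeze the embeddings and all but the top encoder layer,
# so only the top layer and the task head are updated on the new task or language.
for p in encoder.embed.parameters():
    p.requires_grad = False
for layer in encoder.layers[:-1]:
    for p in layer.parameters():
        p.requires_grad = False

trainable = [p for p in list(encoder.parameters()) + list(task_head.parameters())
             if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)
```

Freezing lower layers keeps the shared feature extractor intact while the cheap-to-train upper parameters specialize, which is the trade-off the "adapt only some of the parameters" point refers to.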
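For the active learning item, a minimal sketch of uncertainty sampling, one common AL strategy: score unlabeled in-language examples by the model's confidence and send the least confident ones to annotators. The classifier interface and pool format here are placeholder assumptions, not the course's code.

```python
import torch

def select_for_annotation(model, unlabeled_batches, k=100):
    """Uncertainty sampling: return the k pool examples the model is least
    confident about, to be routed to human annotators."""
    scored = []  # (confidence, example) pairs
    model.eval()
    with torch.no_grad():
        for x in unlabeled_batches:              # x: (batch, ...) input tensor
            probs = torch.softmax(model(x), dim=-1)
            conf, _ = probs.max(dim=-1)          # confidence of the predicted label
            for c, ex in zip(conf.tolist(), x):
                scored.append((c, ex))
    scored.sort(key=lambda s: s[0])              # least confident first
    return [ex for _, ex in scored[:k]]
```

Each round of annotating the selected examples and retraining should concentrate labeling effort where the model is weakest, which is the point of using AL to build in-language training data.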
Taught by
Graham Neubig
Related Courses
Structuring Machine Learning Projects - DeepLearning.AI via Coursera
Natural Language Processing on Google Cloud - Google Cloud via Coursera
Introduction to Learning Transfer and Life Long Learning (3L) - University of California, Irvine via Coursera
Advanced Deployment Scenarios with TensorFlow - DeepLearning.AI via Coursera
Neural Style Transfer with TensorFlow - Coursera Project Network via Coursera