Neural Nets for NLP 2017 - Multilingual and Multitask Learning
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore multilingual and multitask learning in neural networks for natural language processing through this 52-minute lecture by Graham Neubig. Delve into key concepts such as multitask learning, domain adaptation, and multilingual learning. Gain insights into increasing data through multitask approaches, pre-training encoders, and regularization techniques. Examine supervised and unsupervised domain adaptation methods, multilingual inputs and outputs, and teacher-student networks for multilingual adaptation. Understand the various types of multi-tasking and how multiple annotation standards arise across NLP tasks. Access the accompanying slides and related course materials for a comprehensive learning experience in advanced NLP techniques.
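The overview's core idea, sharing one feature extractor across several tasks to increase usable data, can be made concrete with a short sketch. The following is a minimal illustration assuming PyTorch (the lecture does not prescribe an implementation); the class name, task names, and dimensions are assumptions for illustration only.

import torch
import torch.nn as nn

class SharedEncoderMultitask(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, n_tags, n_labels):
        super().__init__()
        # Shared feature extractor: "neural nets are feature extractors"
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Task-specific heads, e.g. tagging and sentence classification
        self.tag_head = nn.Linear(hidden_dim, n_tags)
        self.cls_head = nn.Linear(hidden_dim, n_labels)

    def forward(self, tokens, task):
        states, (h, _) = self.encoder(self.embed(tokens))
        if task == "tagging":
            return self.tag_head(states)   # one prediction per token
        return self.cls_head(h[-1])        # one prediction per sentence

Training would alternate minibatches between the tasks so that gradients from both losses update the shared encoder, which is what lets data from one task benefit the other.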
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning
Examples of Pre-training Encoders
Regularization for Pre-training (e.g. Barone et al. 2017)
Selective Parameter Adaptation
Soft Parameter Tying
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Inputs
Multilingual Structured Prediction / Multilingual Outputs
Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017)
Types of Multi-tasking
Multiple Annotation Standards
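Several syllabus topics above, notably "Regularization for Pre-training (e.g. Barone et al. 2017)" and "Soft Parameter Tying", concern keeping an adapted model close to its pre-trained starting point. As a hedged illustration in the same spirit (not the lecture's code; the function name and default strength are assumptions), one common form is an L2 penalty toward the pre-trained weights rather than toward zero:

import torch

def l2_to_pretrained(model, pretrained, strength=1e-4):
    # Penalize squared distance from the pre-trained parameters,
    # softly tying the fine-tuned model to its starting point.
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        penalty = penalty + (param - pretrained[name]).pow(2).sum()
    return strength * penalty

# Snapshot the pre-trained weights before fine-tuning on in-domain data,
# then add the penalty to the task loss:
#   pretrained = {n: p.detach().clone() for n, p in model.named_parameters()}
#   loss = task_loss + l2_to_pretrained(model, pretrained)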
Taught by
Graham Neubig
Related Courses
Introduction to Deep Learning (Massachusetts Institute of Technology via YouTube)
Taming Dataset Bias via Domain Adaptation (Alexander Amini via YouTube)
Making Our Models Robust to Changing Visual Environments (Andreas Geiger via YouTube)
Learning Compact Representation with Less Labeled Data from Sensors (tinyML via YouTube)
Geo-localization Framework for Real-world Scenarios - Defense Presentation (University of Central Florida via YouTube)