Neural Nets for NLP 2017 - Multilingual and Multitask Learning
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore multilingual and multitask learning in neural networks for natural language processing in this 52-minute lecture by Graham Neubig. Delve into key concepts such as multitask learning, domain adaptation, and multilingual learning. Gain insight into increasing effective training data through multitask approaches, pre-training encoders, and regularization techniques. Examine supervised and unsupervised domain adaptation methods, multilingual inputs and outputs, and teacher-student networks for multilingual adaptation. Understand the various types of multi-tasking and how multiple annotation standards for the same NLP task can be handled. Accompanying slides and related course materials are available for a more comprehensive treatment of these advanced NLP techniques.
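A recurring theme in the lecture is that a neural network acts as a shared feature extractor on top of which several task-specific predictors can sit. As a concrete illustration, the following is a minimal PyTorch sketch of that standard multi-task setup: one BiLSTM encoder shared across tasks, with one classification head per task. All names, dimensions, and the toy data stream are illustrative assumptions, not code from the lecture.

    import torch
    import torch.nn as nn

    class SharedEncoderMultitask(nn.Module):
        # Shared encoder ("feature extractor") with one output head per task.
        def __init__(self, vocab_size, emb_dim, hidden_dim, tagset_sizes):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.LSTM(emb_dim, hidden_dim,
                                   batch_first=True, bidirectional=True)
            # One linear classifier per task, all reading the same features.
            self.heads = nn.ModuleList(
                [nn.Linear(2 * hidden_dim, n) for n in tagset_sizes])

        def forward(self, token_ids, task_id):
            feats, _ = self.encoder(self.embed(token_ids))
            return self.heads[task_id](feats)  # per-token logits for one task

    # Two hypothetical tagging tasks, e.g. POS (18 tags) and NER (9 tags).
    model = SharedEncoderMultitask(10000, 64, 128, tagset_sizes=[18, 9])
    opt = torch.optim.Adam(model.parameters())
    loss_fn = nn.CrossEntropyLoss()

    # Hypothetical toy stream of (task_id, batch) pairs alternating tasks.
    task_batches = [(tid, (torch.randint(0, 10000, (8, 20)),
                           torch.randint(0, [18, 9][tid], (8, 20))))
                    for tid in [0, 1] * 3]
    for task_id, (x, y) in task_batches:
        logits = model(x, task_id)
        loss = loss_fn(logits.reshape(-1, logits.size(-1)), y.reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

Pre-training an encoder, as discussed in the lecture, amounts to training this shared trunk on one task first and then fine-tuning it on another, optionally with regularization that keeps the parameters close to their pre-trained values.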
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning
Examples of Pre-training Encoders
Regularization for Pre-training (e.g. Barone et al. 2017)
Selective Parameter Adaptation
Soft Parameter Tying
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Inputs
Multilingual Structured Prediction / Multilingual Outputs
Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017; see the sketch after this syllabus)
Types of Multi-tasking
Multiple Annotation Standards
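The teacher-student item above (Chen et al. 2017) rests on knowledge distillation: a student network is trained to match a trained teacher's output distribution instead of, or in addition to, hard labels. Below is a minimal, generic sketch of that objective in PyTorch; the temperature T, the frozen teacher, and the batch are illustrative assumptions rather than the specific construction from the paper or the lecture.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, T=1.0):
        # KL divergence between the student's and the teacher's
        # temperature-softened output distributions.
        teacher_probs = F.softmax(teacher_logits / T, dim=-1)
        student_logp = F.log_softmax(student_logits / T, dim=-1)
        return F.kl_div(student_logp, teacher_probs,
                        reduction="batchmean") * (T * T)

    # Toy check: identical logits give (near-)zero loss.
    logits = torch.randn(4, 10)
    print(distillation_loss(logits, logits.clone()))  # ~0.0

    # In training, the teacher is frozen so gradients reach only the student:
    #   with torch.no_grad():
    #       t_logits = teacher(batch)      # hypothetical trained teacher
    #   loss = distillation_loss(student(batch), t_logits, T=2.0)
    #   loss.backward()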
Taught by
Graham Neubig
Related Courses
TensorFlow Developer Certificate Exam Prep (A Cloud Guru)
Post Graduate Certificate in Advanced Machine Learning & AI (Indian Institute of Technology Roorkee via Coursera)
Advanced AI Techniques for the Supply Chain (LearnQuest via Coursera)
Advanced Learning Algorithms (DeepLearning.AI via Coursera)
IBM AI Engineering (IBM via Coursera)