YoVDO

Neural Nets for NLP 2017 - Multilingual and Multitask Learning

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Natural Language Processing (NLP) Courses, Domain Adaptation Courses

Course Description

Overview

Explore multilingual and multitask learning in neural networks for natural language processing through this 52-minute lecture by Graham Neubig. Delve into key concepts such as multitask learning, domain adaptation, and multilingual learning. Gain insights into increasing data through multitask approaches, pre-training encoders, and regularization techniques. Examine supervised and unsupervised domain adaptation methods, multilingual inputs and outputs, and teacher-student networks for multilingual adaptation. Understand various types of multi-tasking and multiple annotation standards in NLP tasks. Access accompanying slides and related course materials for a comprehensive learning experience in advanced NLP techniques.
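To give a flavor of the "standard multi-task learning" idea covered in the lecture, here is a minimal sketch of hard parameter sharing: two tasks train through one shared encoder while keeping separate output heads, so the encoder accumulates gradient signal from both tasks. All data, shapes, and hyperparameters below are toy illustrations, not values from the lecture.

```python
import numpy as np

# Toy setup: two regression tasks over the same 5-dimensional inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))      # 8 examples, 5 input features
y_a = rng.normal(size=(8, 1))    # task A targets (illustrative)
y_b = rng.normal(size=(8, 1))    # task B targets (illustrative)

W_enc = 0.1 * rng.normal(size=(5, 4))   # shared encoder
W_a = 0.1 * rng.normal(size=(4, 1))     # task-A head
W_b = 0.1 * rng.normal(size=(4, 1))     # task-B head

def task_losses():
    h = X @ W_enc                         # shared representation
    return ((h @ W_a - y_a) ** 2).mean(), ((h @ W_b - y_b) ** 2).mean()

loss_a0, loss_b0 = task_losses()
lr = 0.02
for _ in range(200):
    h = X @ W_enc
    g_a = 2 * (h @ W_a - y_a) / len(X)    # dMSE/dprediction, task A
    g_b = 2 * (h @ W_b - y_b) / len(X)    # dMSE/dprediction, task B
    # Each head only sees its own task's gradient, while the
    # shared encoder sums contributions from both tasks.
    g_enc = X.T @ (g_a @ W_a.T + g_b @ W_b.T)
    W_a -= lr * h.T @ g_a
    W_b -= lr * h.T @ g_b
    W_enc -= lr * g_enc

loss_a1, loss_b1 = task_losses()
```

In practice the encoder would be a neural network trained by an autodiff framework, but the gradient flow is the same: the shared parameters receive updates from every task, which is the mechanism by which multitasking effectively increases training data for the encoder.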

Syllabus

Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning
Examples of Pre-training Encoders
Regularization for Pre-training (e.g. Barone et al. 2017)
Selective Parameter Adaptation
Soft Parameter Tying
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Inputs
Multilingual Structured Prediction / Multilingual Outputs
Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017)
Types of Multi-tasking
Multiple Annotation Standards
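The "Soft Parameter Tying" item above contrasts with hard sharing: each task keeps its own parameters, and an L2 penalty lam * ||W1 - W2||^2 pulls them toward each other. The sketch below shows this on two closely related toy regression tasks; the data, tying strength, and learning rate are illustrative assumptions, not values from the lecture.

```python
import numpy as np

# Two related tasks with their own weights, softly tied by an L2 penalty.
rng = np.random.default_rng(1)
X = rng.normal(size=(16, 3))
true1 = np.array([[1.0], [2.0], [0.5]])   # task 1 "true" weights (toy)
true2 = np.array([[1.1], [1.9], [0.6]])   # task 2 is a close relative
y1 = X @ true1 + 0.1 * rng.normal(size=(16, 1))
y2 = X @ true2 + 0.1 * rng.normal(size=(16, 1))

W1 = np.zeros((3, 1))
W2 = np.zeros((3, 1))
lam, lr = 0.5, 0.05                       # tying strength, step size
for _ in range(500):
    # Gradient of MSE plus the tying penalty lam * ||W1 - W2||^2.
    g1 = 2 * X.T @ (X @ W1 - y1) / len(X) + 2 * lam * (W1 - W2)
    g2 = 2 * X.T @ (X @ W2 - y2) / len(X) + 2 * lam * (W2 - W1)
    W1 -= lr * g1
    W2 -= lr * g2
```

Unlike hard sharing, each task can still deviate where its data demands it; the penalty only regularizes the two parameter sets toward each other, which is useful when tasks (or annotation standards) are similar but not identical.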


Taught by

Graham Neubig

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX