Neural Nets for NLP 2017 - Multilingual and Multitask Learning
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore multilingual and multitask learning in neural networks for natural language processing in this 52-minute lecture by Graham Neubig. Delve into key concepts such as multitask learning, domain adaptation, and multilingual learning. Gain insight into increasing effective training data through multitask approaches, pre-training encoders, and regularization techniques. Examine supervised and unsupervised domain adaptation methods, multilingual inputs and outputs, and teacher-student networks for multilingual adaptation. Understand the different types of multi-tasking and how to handle multiple annotation standards across NLP tasks. Accompanying slides and related course materials are available for a comprehensive learning experience in advanced NLP techniques.
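The "standard multi-task learning" setup covered in the lecture is usually hard parameter sharing: one shared encoder feeds several small task-specific output heads, so each extra task adds data that trains the shared features. A minimal toy sketch of that structure (all names such as `encode`, `head_a`, `head_b` are illustrative, not from the lecture):

```python
import random

random.seed(0)

# Shared encoder parameters: used by EVERY task (hard parameter sharing).
shared_w = [random.uniform(-1, 1) for _ in range(4)]

def encode(x):
    # Shared feature extractor: here just an elementwise reweighting.
    return [xi * wi for xi, wi in zip(x, shared_w)]

# Task-specific heads: each task adds only its own small parameter set,
# e.g. one head for POS tagging and one for NER.
head_a = [random.uniform(-1, 1) for _ in range(4)]
head_b = [random.uniform(-1, 1) for _ in range(4)]

def predict(x, head):
    h = encode(x)                                   # shared computation
    return sum(hi * wi for hi, wi in zip(h, head))  # task-specific score

x = [1.0, 0.5, -0.3, 2.0]
score_a = predict(x, head_a)  # task A's score for this input
score_b = predict(x, head_b)  # task B's score for the same shared features
```

Training on task A's loss updates `shared_w` as well as `head_a`, which is how multitasking "increases data" for the shared representation.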
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning
Examples of Pre-training Encoders
Regularization for Pre-training (e.g. Barone et al. 2017)
Selective Parameter Adaptation
Soft Parameter Tying
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Inputs
Multilingual Structured Prediction / Multilingual Outputs
Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017)
Types of Multi-tasking
Multiple Annotation Standards
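The "Supervised Domain Adaptation through Feature Augmentation" topic above is commonly formulated as "frustratingly easy" feature augmentation (Daumé III, 2007): each feature vector is copied into a shared slot plus one slot per domain, letting the model learn which features transfer and which are domain-specific. A minimal sketch of that copying scheme (the function name `augment` is illustrative):

```python
def augment(x, domain, n_domains=2):
    # Feature augmentation for supervised domain adaptation:
    # output = [shared copy] + [per-domain slot for each domain],
    # where only the input's own domain slot holds the features.
    out = list(x)  # shared copy, active for every domain
    for d in range(n_domains):
        out.extend(x if d == domain else [0.0] * len(x))
    return out

augment([1.0, 2.0], domain=0)  # -> [1.0, 2.0, 1.0, 2.0, 0.0, 0.0]
augment([1.0, 2.0], domain=1)  # -> [1.0, 2.0, 0.0, 0.0, 1.0, 2.0]
```

Weights learned on the shared slot are reused across domains, while the per-domain slots absorb domain-specific behavior.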
Taught by
Graham Neubig
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam