Neural Nets for NLP - Multi-task, Multi-lingual Learning
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning (see sketch after the syllabus)
Examples of Pre-training Encoders: common to pre-train encoders for downstream tasks
Regularization for Pre-training (e.g. Barone et al. 2017): pre-training relies on the fact that we won't move too far from the pre-trained parameters (see sketch after the syllabus)
Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters (see sketch after the syllabus)
Soft Parameter Tying (see sketch after the syllabus)
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Structured Prediction / Multilingual Outputs: things are harder when predicting a sequence of actions (parsing) or words (MT) in different languages
Multi-lingual Sequence-to-sequence Models (see sketch after the syllabus)
Types of Multi-tasking
Multiple Annotation Standards
Different Layers for Different Tasks
Summary of design dimensions
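For the "Standard Multi-task Learning" item above, here is a minimal PyTorch sketch, assuming two toy classification tasks (the task names, vocabulary size, and dimensions are all illustrative): one shared encoder acts as the feature extractor, each task gets its own output head, and training alternates between tasks batch by batch.

import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    # One encoder reused by every task: the shared feature extractor.
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, tokens):
        out, _ = self.lstm(self.embed(tokens))
        return out[:, -1]  # last hidden state as a sequence representation

encoder = SharedEncoder()
heads = {"sentiment": nn.Linear(64, 2), "topic": nn.Linear(64, 5)}  # task-specific heads
params = list(encoder.parameters()) + [p for h in heads.values() for p in h.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(task, tokens, labels):
    # Only the chosen task's head is used, but the shared encoder
    # accumulates gradients from every task over the course of training.
    opt.zero_grad()
    loss = loss_fn(heads[task](encoder(tokens)), labels)
    loss.backward()
    opt.step()
    return loss.item()

# Alternate mini-batches from the two tasks (random toy data).
train_step("sentiment", torch.randint(0, 1000, (8, 12)), torch.randint(0, 2, (8,)))
train_step("topic", torch.randint(0, 1000, (8, 12)), torch.randint(0, 5, (8,)))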
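For "Regularization for Pre-training", one option in the spirit of Barone et al. 2017 is an L2 penalty that keeps fine-tuned weights close to their pre-trained values; continuing the sketch above (the coefficient is an arbitrary placeholder):

import copy

pretrained = copy.deepcopy(encoder.state_dict())  # snapshot taken before fine-tuning

def l2_to_pretrained(model, reg=1e-3):
    # Squared distance of every parameter from its pre-trained value.
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + ((p - pretrained[name]) ** 2).sum()
    return reg * penalty

# In a fine-tuning step: loss = task_loss + l2_to_pretrained(encoder)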
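For "Selective Parameter Adaptation", the simplest instance is freezing: keep the pre-trained encoder fixed and adapt only the task head (again continuing the sketch; which parameters to freeze is a design choice, not a fixed recipe).

for p in encoder.parameters():
    p.requires_grad = False  # the encoder stays at its pre-trained values
head_opt = torch.optim.Adam(heads["sentiment"].parameters(), lr=1e-3)  # tune only the head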
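For "Soft Parameter Tying", a sketch of the idea: each task keeps its own copy of the parameters, and an L2 term pulls the copies toward each other instead of forcing them to be identical (hard sharing).

enc_a, enc_b = SharedEncoder(), SharedEncoder()  # one encoder copy per task

def tying_penalty(model_a, model_b, reg=1e-3):
    # Penalize the distance between corresponding parameters of the two copies.
    penalty = 0.0
    for pa, pb in zip(model_a.parameters(), model_b.parameters()):
        penalty = penalty + ((pa - pb) ** 2).sum()
    return reg * penalty

# Each task's loss adds tying_penalty(enc_a, enc_b) to its own objective.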
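For "Multi-lingual Sequence-to-sequence Models", a common recipe (e.g. Google's multilingual NMT, Johnson et al. 2017) trains one model on all language pairs and prepends a target-language tag to the source sentence; a toy sketch (the tag format is illustrative):

def tag_source(src_tokens, tgt_lang):
    # The tag tells the single shared model which language to produce.
    return ["<2{}>".format(tgt_lang)] + src_tokens

print(tag_source(["ich", "bin", "müde"], "en"))  # ['<2en>', 'ich', 'bin', 'müde']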
Taught by
Graham Neubig
Related Courses
TensorFlow Developer Certificate Exam Prep
A Cloud Guru
Post Graduate Certificate in Advanced Machine Learning & AI
Indian Institute of Technology Roorkee via Coursera
Advanced AI Techniques for the Supply Chain
LearnQuest via Coursera
Advanced Learning Algorithms
DeepLearning.AI via Coursera
IBM AI Engineering
IBM via Coursera