Pre-Trained Multilingual Sequence to Sequence Models for NMT - Tips, Tricks and Challenges
Offered By: Toronto Machine Learning Series (TMLS) via YouTube
Course Description
Overview
Explore the world of Neural Machine Translation (NMT) in this comprehensive 90-minute tutorial presented by Annie En-Shiun Lee, Assistant Professor at the University of Toronto's Computer Science Department. Delve into the rapid evolution of NMT and the power of Pre-trained Multilingual Sequence to Sequence (PMSS) models like mBART and mT5. Learn how these models, pre-trained on extensive general data, can be fine-tuned for impressive results in various natural language tasks. Gain insights into adapting pre-trained models for NMT, discover essential tips and tricks for training and evaluation, and understand the challenges faced when implementing these models. Whether you're approaching NMT from a research or industry perspective, this tutorial offers valuable knowledge to enhance your understanding and application of cutting-edge translation technology.
Syllabus
Pre-Trained Multilingual Sequence to Sequence Models for NMT - Tips, Tricks and Challenges
Taught by
Toronto Machine Learning Series (TMLS)
Related Courses
Simple Recurrent Neural Network with Keras - Coursera Project Network via Coursera
Deep Learning: Advanced Natural Language Processing and RNNs - Udemy
Machine Translation - Great Learning via YouTube
Tensorflow 2.0 | Recurrent Neural Networks, LSTMs, GRUs - Udemy
Pytorch Transformers for Machine Translation - Aladdin Persson via YouTube