YoVDO

CMU Multilingual NLP 2022 - Unsupervised Machine Translation

Offered By: Graham Neubig via YouTube

Tags

Natural Language Processing (NLP) Courses
Machine Learning Courses

Course Description

Overview

Explore unsupervised machine translation techniques in this 43-minute lecture by Graham Neubig. Delve into topics such as unsupervised pre-training for language models and sequence-to-sequence models, initialization methods including unsupervised word translation and adversarial techniques, and the fundamentals of phrase-based statistical MT. Learn about the training objectives and performance metrics for unsupervised MT systems, and understand the crucial role of back-translation in improving translation quality. Gain insights into approaches for tackling translation tasks without parallel data and discover methods for collecting or generating necessary data.
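The back-translation idea mentioned above can be sketched in a few lines: monolingual target-language text is translated back into the source language to create synthetic parallel pairs for training the forward model. The word-by-word "model", the lexicon, and all names below are illustrative assumptions, not the lecture's actual implementation.

```python
# Minimal back-translation sketch (illustrative only, not the
# lecture's actual system).

def translate(sentence, lexicon):
    """Toy word-by-word translation via a bilingual lexicon;
    unknown words are passed through unchanged."""
    return " ".join(lexicon.get(w, w) for w in sentence.split())

# Toy target->source lexicon (e.g. French -> English), as might be
# induced by unsupervised word translation during initialization.
tgt_to_src = {"le": "the", "chat": "cat", "noir": "black"}

# Monolingual target-language corpus (no parallel data available).
mono_tgt = ["le chat noir", "le chat"]

# Back-translate the monolingual target sentences to create
# synthetic (source, target) pairs for training the forward model.
synthetic_pairs = [(translate(t, tgt_to_src), t) for t in mono_tgt]

for src, tgt in synthetic_pairs:
    print(src, "->", tgt)
```

In a real system the two translation directions are neural models trained jointly, and back-translation is iterated so that each direction improves the synthetic data for the other.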

Syllabus

Intro
Conditional Text Generation
What if we don't have parallel data?
Can't we just collect/generate the data?
Unsupervised Translation
Outline
Step 1: Initialization
Initialization: Unsupervised Word Translation
Unsupervised Word Translation: Adversarial Training
One-slide primer on phrase-based statistical MT
Unsupervised Statistical MT
Unsupervised MT: Training Objective 1
How does it work?
Step 2: Back-translation
Performance
Taught by

Graham Neubig

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent