CMU Multilingual NLP 2022 - Unsupervised Machine Translation

Offered By: Graham Neubig via YouTube

Tags

Natural Language Processing (NLP) Courses Machine Learning Courses

Course Description

Overview

Explore unsupervised machine translation techniques in this 43-minute lecture by Graham Neubig. Delve into topics such as unsupervised pre-training for language models and sequence-to-sequence models, initialization methods including unsupervised word translation and adversarial techniques, and the fundamentals of phrase-based statistical MT. Learn about the training objectives and performance metrics for unsupervised MT systems, and understand the crucial role of back-translation in improving translation quality. Gain insights into approaches for tackling translation tasks without parallel data and discover methods for collecting or generating necessary data.
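The back-translation step highlighted above works by having each translation direction generate synthetic source sentences for the other: monolingual text in one language is machine-translated, and the resulting (synthetic, real) pairs are used as training data. The snippet below is a minimal sketch of one such round; the translate_s2t and translate_t2s callables are hypothetical stand-ins for the two translation models, not code from the lecture.

```python
# Minimal sketch of one round of back-translation.
# translate_s2t / translate_t2s are hypothetical placeholders for the two
# MT models (source->target and target->source); not from the lecture.
from typing import Callable, List, Tuple

def back_translation_round(
    src_mono: List[str],                  # monolingual source-language sentences
    tgt_mono: List[str],                  # monolingual target-language sentences
    translate_s2t: Callable[[str], str],  # current source->target model (assumed)
    translate_t2s: Callable[[str], str],  # current target->source model (assumed)
) -> Tuple[List[Tuple[str, str]], List[Tuple[str, str]]]:
    """Turn monolingual text into synthetic parallel pairs for the next
    training round, so no human-translated sentence pairs are needed."""
    # (synthetic source, real target): trains the source->target direction.
    s2t_pairs = [(translate_t2s(t), t) for t in tgt_mono]
    # (synthetic target, real source): trains the target->source direction.
    t2s_pairs = [(translate_s2t(s), s) for s in src_mono]
    return s2t_pairs, t2s_pairs

if __name__ == "__main__":
    # Toy word-for-word "models" standing in for real translators.
    en2de = lambda s: " ".join({"hello": "hallo", "world": "welt"}.get(w, w) for w in s.split())
    de2en = lambda s: " ".join({"hallo": "hello", "welt": "world"}.get(w, w) for w in s.split())
    s2t, t2s = back_translation_round(["hello world"], ["hallo welt"], en2de, de2en)
    print(s2t)  # [('hello world', 'hallo welt')] -> synthetic data for en->de
    print(t2s)  # [('hallo welt', 'hello world')] -> synthetic data for de->en
```

In practice each model is then retrained on the synthetic pairs produced by the other direction and the loop repeats, which is how back-translation gradually improves translation quality without parallel data.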

Syllabus

Intro
Conditional Text Generation
What if we don't have parallel data?
Can't we just collect/generate the data?
Unsupervised Translation
Outline
Step 1: Initialization
Initialization: Unsupervised Word Translation
Unsupervised Word Translation: Adversarial Training
One slide primer on phrase-based statistical MT
Unsupervised Statistical MT
Unsupervised MT: Training Objective 1
How does it work?
Step 2: Back-translation
Performance


Taught by

Graham Neubig

Related Courses

Natural Language Processing
Columbia University via Coursera
Natural Language Processing
Stanford University via Coursera
Introduction to Natural Language Processing
University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies)
Universidad de Alicante via Miríadax
Natural Language Processing
Indian Institute of Technology, Kharagpur via Swayam