CMU Multilingual NLP - Machine Translation and Sequence-to-Sequence Models
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Language Models • Language models are generative models of text
Conditioned Language Models
Calculating the Probability of a Sentence (see the sketch after this syllabus)
Conditional Language Models
One Type of Language Model (Mikolov et al. 2011)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search (see the sketch after this syllabus)
Sentence Representations
Calculating Attention (1)
A Graphical Example
Attention Score Functions (1) (see the sketch after this syllabus)
Attention is not Alignment! (Koehn and Knowles 2017)
Coverage
Multi-headed Attention
Supervised Training (Liu et al. 2016)
Self Attention (Cheng et al. 2016) • Each element in the sentence attends to other elements (see the sketch after this syllabus)
Why Self Attention?
Transformer Attention Tricks
Transformer Training Tricks
Masking for Training • We want to perform training in as few operations as possible, using big matrix multiplies (see the sketch after this syllabus)
A Unified View of Sequence-to-sequence Models
Code Walk
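
For the "Calculating the Probability of a Sentence" item, the standard decomposition is the chain rule: P(x_1..x_n) = prod_t P(x_t | x_1..x_{t-1}). A minimal sketch of that decomposition, assuming a toy bigram table as the model (the lecture uses neural models; the tokens and probabilities here are invented purely for illustration):

    import math

    # Invented conditional probabilities P(word | previous word), illustration only.
    BIGRAM_PROB = {
        ("<s>", "the"): 0.4, ("the", "cat"): 0.1,
        ("cat", "sat"): 0.3, ("sat", "</s>"): 0.5,
    }

    def sentence_log_prob(words):
        """log P(sentence) = sum of log P(w_t | w_{t-1}) under the toy model."""
        total = 0.0
        for prev, cur in zip(["<s>"] + words, words + ["</s>"]):
            total += math.log(BIGRAM_PROB.get((prev, cur), 1e-10))  # floor unseen pairs
        return total

    print(sentence_log_prob(["the", "cat", "sat"]))  # log(0.4 * 0.1 * 0.3 * 0.5)

A conditioned language model additionally conditions each factor on a source sentence, P(x_t | x_1..x_{t-1}, source), which is what makes this machinery usable for translation.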
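The "Ancestral Sampling", "Greedy Search", and "Beam Search" items are decoding strategies for the generation problem. A sketch of beam search over a generic next-token scorer; log_probs, the vocabulary, and the eos token here are placeholders rather than anything from the lecture:

    def beam_search(log_probs, beam_size=3, max_len=10, eos=0):
        """log_probs(prefix) returns a list of log P(token | prefix) over the vocabulary."""
        beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
        for _ in range(max_len):
            candidates = []
            for seq, score in beams:
                if seq and seq[-1] == eos:          # finished hypotheses carry over unchanged
                    candidates.append((seq, score))
                    continue
                for tok, lp in enumerate(log_probs(seq)):
                    candidates.append((seq + [tok], score + lp))
            # keep only the best beam_size hypotheses
            beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
            if all(seq and seq[-1] == eos for seq, _ in beams):
                break
        return beams[0][0]

Greedy search is the beam_size=1 special case; ancestral sampling instead draws each token at random from P(token | prefix).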
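For "Attention Score Functions", the common options score a decoder query vector q against an encoder key vector k. A NumPy sketch of the usual variants (dot product, bilinear, multi-layer perceptron, and the Transformer's scaled dot product); the parameter shapes are assumptions:

    import numpy as np

    def dot_score(q, k):                 # dot product: q . k
        return q @ k

    def bilinear_score(q, k, W):         # bilinear: q^T W k, with learned W
        return q @ W @ k

    def mlp_score(q, k, W1, w2):         # MLP: w2^T tanh(W1 [q; k])
        return w2 @ np.tanh(W1 @ np.concatenate([q, k]))

    def scaled_dot_score(q, k):          # scaled dot product: (q . k) / sqrt(d)
        return (q @ k) / np.sqrt(len(k))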
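For "Self Attention", where each element in the sentence attends to the other elements, a minimal single-head sketch; the weight matrices here are random placeholders standing in for learned parameters:

    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X, Wq, Wk, Wv):
        """X: (n, d) matrix of token vectors; every token queries every token."""
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])  # (n, n) pairwise compatibility
        return softmax(scores) @ V               # each row mixes the whole sentence

    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 8))                  # 4 tokens, dimension 8
    out = self_attention(X, *(rng.normal(size=(8, 8)) for _ in range(3)))

Multi-headed attention runs several such maps in parallel on lower-dimensional projections and concatenates the results.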
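For "Masking for Training": computing all timesteps with one big matrix multiply would let each position see the future, so the score matrix is masked before the softmax. A sketch of the standard causal mask (the shapes are arbitrary):

    import numpy as np

    def causal_mask(n):
        # -inf strictly above the diagonal: position i may not attend to any j > i
        return np.triu(np.full((n, n), -np.inf), k=1)

    scores = np.random.randn(5, 5) + causal_mask(5)
    # softmax turns the -inf entries into exactly zero weight, so the full
    # (n, n) attention can be trained in one matmul while staying left-to-right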
Taught by
Graham Neubig
Related Courses
Generative AI Language Modeling with Transformers (IBM via Coursera)
Transformer Models and BERT Model - Deutsch (Google Cloud via Coursera)
Generative AI: Introduction to Large Language Models (LinkedIn Learning)
Generative AI: Working with Large Language Models (LinkedIn Learning)
TensorFlow: Working with NLP (LinkedIn Learning)