Neural Nets for NLP - Latent Variable Models
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Discriminative vs. Generative Models
Quiz: What Types of Variables?
Why Latent Random Variables?
A Latent Variable Model
What is Our Loss Function? We would like to maximize the corpus log likelihood
Disconnect Between Samples and Objective
VAE Objective: We can create an optimizable objective matching our problem, starting with KL divergence
Interpreting the VAE Objective
Problem: Straightforward Sampling is Inefficient
Problem: Sampling Breaks Backprop
Solution: Re-parameterization Trick
Generating from Language Models
Motivation for Latent Variables
Difficulties in Training
KL Divergence Annealing
Weaken the Decoder
Discrete Latent Variables?
Enumeration
Method 2: Sampling
Reparameterization (Maddison et al. 2017; Jang et al. 2017)
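The syllabus topics above center on a few concrete techniques: the Gaussian reparameterization trick, the KL term of the VAE objective, KL divergence annealing, and the Gumbel-Softmax relaxation for discrete latent variables (Maddison et al. 2017; Jang et al. 2017). A minimal numpy sketch of these pieces follows; it is illustrative only, not the lecture's code, and all function names are our own.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I).
    # The randomness lives in eps, so gradients flow through mu and log_var.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def gaussian_kl(mu, log_var):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions;
    # this is the regularization term of the VAE objective.
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

def kl_weight(step, warmup_steps):
    # KL divergence annealing: linearly ramp the KL weight from 0 to 1
    # over warmup_steps so the decoder learns to use the latent code.
    return min(1.0, step / warmup_steps)

def gumbel_softmax(logits, temperature, rng):
    # Gumbel-Softmax / Concrete relaxation: differentiable approximate
    # samples from a categorical distribution over discrete latents.
    gumbel_noise = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel_noise) / temperature
    y = y - y.max()          # stabilize before exponentiating
    expy = np.exp(y)
    return expy / expy.sum()
```

For example, `gaussian_kl(np.zeros(4), np.zeros(4))` is 0, since the posterior already matches the standard normal prior, and lowering `temperature` in `gumbel_softmax` pushes the relaxed sample toward a one-hot vector.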
Taught by
Graham Neubig