Neural Nets for NLP - Latent Variable Models
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Discriminative vs. Generative Models
Quiz: What Types of Variables?
Why Latent Random Variables?
A Latent Variable Model
What is Our Loss Function? We would like to maximize the corpus log likelihood
Disconnect Between Samples and Objective
VAE Objective: We can create an optimizable objective matching our problem, starting with KL divergence
Interpreting the VAE Objective
Problem: Straightforward Sampling is Inefficient
Problem! Sampling Breaks Backprop
Solution: Re-parameterization Trick
Generating from Language Models
Motivation for Latent Variables
Difficulties in Training
KL Divergence Annealing
Weaken the Decoder
Discrete Latent Variables?
Enumeration
Method 2: Sampling
Reparameterization (Maddison et al. 2017; Jang et al. 2017)
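The syllabus items on the loss function and the VAE objective refer to the standard evidence lower bound (ELBO). As a reference point (using conventional VAE notation, not necessarily the lecture's own symbols), the bound follows from Jensen's inequality:

```latex
\log p(x) = \log \int p(x \mid z)\, p(z)\, dz
          = \log \mathbb{E}_{q(z \mid x)}\!\left[\frac{p(x \mid z)\, p(z)}{q(z \mid x)}\right]
          \ge \underbrace{\mathbb{E}_{q(z \mid x)}\!\left[\log p(x \mid z)\right]}_{\text{reconstruction}}
            - \underbrace{\mathrm{KL}\!\left(q(z \mid x)\,\|\,p(z)\right)}_{\text{regularizer}}
```

The two terms match the lecture's framing: the expectation encourages faithful reconstruction of the data, while the KL term keeps the approximate posterior close to the prior.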
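The "Sampling Breaks Backprop" and "Re-parameterization Trick" items refer to the standard fix of moving the randomness outside the computation graph. A minimal NumPy sketch (function names and shapes are illustrative, not from the course materials):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    # Sampling z ~ N(mu, sigma^2) directly is not differentiable w.r.t.
    # mu and sigma. Instead, sample eps ~ N(0, I) (no parameters involved)
    # and compute z = mu + sigma * eps, which is differentiable in both.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Example: a 4-dimensional latent with mean 0 and unit variance.
mu = np.zeros(4)
log_var = np.zeros(4)
z = reparameterize(mu, log_var, rng)
```

In a real framework (e.g. PyTorch's `rsample`), the same transformation lets gradients flow from the decoder loss back into the encoder that predicts `mu` and `log_var`.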
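The final syllabus item cites the Gumbel-softmax / Concrete relaxation (Maddison et al. 2017; Jang et al. 2017) for reparameterizing discrete latent variables. A minimal single-sample sketch, assuming plain NumPy rather than the lecture's framework:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau, rng):
    # Add Gumbel(0, 1) noise to the logits, then apply a
    # temperature-controlled softmax. As tau -> 0 samples approach
    # one-hot vectors; larger tau gives smoother, lower-variance samples.
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Example: a relaxed sample over 3 categories.
sample = gumbel_softmax(np.log(np.array([0.1, 0.6, 0.3])), tau=0.5, rng=rng)
```

Because the output is a continuous relaxation of a one-hot vector, gradients can flow through it, which is what makes enumeration-free training of discrete latents practical.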
Taught by
Graham Neubig
Related Courses
Visual Recognition & Understanding - University at Buffalo via Coursera
Deep Learning for Computer Vision - IIT Hyderabad via Swayam
Deep Learning in Life Sciences - Spring 2021 - Massachusetts Institute of Technology via YouTube
Advanced Deep Learning Methods for Healthcare - University of Illinois at Urbana-Champaign via Coursera
Generative Models - Serrano.Academy via YouTube