YoVDO

Neural Nets for NLP - Latent Random Variables

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Machine Learning Courses, Natural Language Processing (NLP) Courses, Variational Autoencoders Courses, Generative Models Courses

Course Description

Overview

Explore latent random variables in neural networks for natural language processing through this comprehensive lecture from CMU's CS 11-747 course. Delve into the distinctions between generative and discriminative models, as well as deterministic and random variables. Examine variational autoencoders, their architecture, and challenges in training. Learn techniques for handling discrete latent variables, including enumeration, sampling, and reparameterization. Discover practical applications of variational models in language processing, controllable text generation, and symbol sequence modeling. Gain insights from examples and case studies presented throughout the lecture to deepen your understanding of these advanced NLP concepts.

Syllabus

Intro
Discriminative vs. Generative Models • Discriminative model: calculates the probability of the output given the input
Quiz: What Types of Variables? • In an attentional sequence-to-sequence model trained with MLE/teacher forcing, are the following variables observed or latent? Deterministic or random?
Why Latent Random Variables?
What is a Latent Random Variable Model?
A Latent Variable Model
An Example (Doersch 2016)
Variational Inference
Practice
Variational Autoencoders
VAE vs. AE
Problem! Sampling Breaks Backprop
Solution: Re-parameterization Trick
Motivation for Latent Variables • Allows for a consistent latent space of sentences?
Difficulties in Training
KL Divergence Annealing • Basic idea: Multiply KL term by a constant starting at zero, then gradually increase to 1 • Result: model can learn to use z before getting penalized
Solution 2: Weaken the Decoder • But theoretically still problematic: it can be shown that the optimal strategy is to ignore z when it is not necessary (Chen et al. 2017)
Aggressive Inference Network Learning
Discrete Latent Variables?
Enumeration
Method 2: Sampling • Randomly sample a subset of configurations of z and optimize with respect to this subset
Method 3: Reparameterization (Maddison et al. 2017, Jang et al. 2017)
Variational Models of Language Processing (Miao et al. 2016) • Present models with random variables for document modeling and question answer pair selection
Controllable Text Generation (Hu et al. 2017)
Symbol Sequence Latent Variables (Miao and Blunsom 2016) • Encoder-decoder with a sequence of latent symbols
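Two of the techniques listed above, the re-parameterization trick and KL divergence annealing, can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the lecture; the function names and the linear annealing schedule are assumptions:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Re-parameterization trick: write z = mu + sigma * eps with eps ~ N(0, I).
    # The randomness is isolated in eps, so gradients can flow through mu and
    # log_var (plain sampling from N(mu, sigma^2) would break backprop).
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over dimensions;
    # this is the KL term in the VAE objective.
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

def kl_anneal_weight(step, warmup_steps):
    # KL annealing: multiply the KL term by a weight that starts at 0 and
    # rises to 1, so the model can learn to use z before being penalized.
    return min(1.0, step / warmup_steps)

# Hypothetical usage: an annealed KL penalty at training step 500 of a
# 1000-step warmup.
rng = np.random.default_rng(0)
mu, log_var = np.array([0.5, -0.2]), np.array([0.1, -0.3])
z = reparameterize(mu, log_var, rng)
penalty = kl_anneal_weight(500, 1000) * kl_to_standard_normal(mu, log_var)
```

The key design point is that `reparameterize` is differentiable with respect to `mu` and `log_var`, which is exactly what the Gumbel-Softmax methods cited above (Maddison et al. 2017, Jang et al. 2017) generalize to discrete latent variables.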


Taught by

Graham Neubig

Related Courses

GenAI and Model Selection
Coursera Instructor Network via Coursera
Generative AI and LLMs: Architecture and Data Preparation
IBM via Coursera
Recommender Systems Complete Course Beginner to Advanced
Packt via Coursera
Image Compression and Generation using Variational Autoencoders in Python
Coursera Project Network via Coursera
What Is Generative AI?
LinkedIn Learning