Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a comprehensive video explanation of the paper "Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions". Delve into the challenges of combining discrete probability distributions and combinatorial optimization problems with neural networks. Learn about the Implicit Maximum Likelihood Estimation (I-MLE) framework for end-to-end learning of models that combine discrete exponential family distributions with differentiable neural components. Discover how I-MLE enables backpropagation through discrete algorithms, allowing combinatorial optimizers to be part of a network's forward pass. Follow along as the video breaks down key concepts, including the straight-through estimator, encoding discrete problems as inner products, and approximating marginals via perturb-and-MAP. Gain insights into the paper's contributions, methodology, and practical applications through detailed explanations and visual aids.
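To make the core idea concrete, here is a minimal NumPy sketch of the kind of scheme the video discusses: perturb the scores, run a combinatorial MAP solver (a toy top-k selector here) in the forward pass, and substitute a gradient built from the difference between the MAP solution for the current parameters and the MAP solution for "target" parameters shifted against the incoming loss gradient. The function names, the top-k problem, and the use of Gumbel noise in place of the paper's Sum-of-Gamma perturbation are illustrative assumptions, not the authors' implementation; see the GitHub page referenced in the syllabus for the actual code.

```python
import numpy as np

def map_topk(theta, k):
    """MAP solver for a toy k-subset problem: returns a 0/1 vector
    selecting the k highest-scoring entries of theta."""
    z = np.zeros_like(theta)
    z[np.argsort(-theta)[:k]] = 1.0
    return z

def imle_gradient(theta, dloss_dz, k, lam=10.0, rng=None):
    """I-MLE-style gradient estimate for the toy problem: perturb the
    scores (perturb-and-MAP), solve MAP once for the current parameters
    and once for target parameters shifted against the downstream loss
    gradient, and use the difference of the two discrete solutions as
    the gradient w.r.t. theta."""
    rng = rng or np.random.default_rng(0)
    # Gumbel noise as a simple stand-in for the Sum-of-Gamma
    # perturbation discussed in the video (an assumption).
    eps = rng.gumbel(size=theta.shape)
    z = map_topk(theta + eps, k)              # forward-pass sample
    theta_target = theta - lam * dloss_dz     # target-distribution parameters
    z_target = map_topk(theta_target + eps, k)
    grad_theta = (z - z_target) / lam         # substituted gradient
    return grad_theta, z

# Toy usage: scores from an upstream network and a downstream gradient w.r.t. z.
theta = np.array([2.0, 0.5, -1.0, 1.5, 0.1])
dloss_dz = np.array([0.0, 1.0, 0.0, -1.0, 0.0])
grad_theta, z = imle_gradient(theta, dloss_dz, k=2)
print("forward MAP state:", z)
print("I-MLE gradient estimate:", grad_theta)
```

The design point the sketch illustrates is that the solver itself is never differentiated; only its discrete outputs under perturbed inputs are compared, which is what lets an arbitrary combinatorial optimizer sit inside the forward pass.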
Syllabus
- Intro & Overview
- Sponsor: Weights & Biases
- Problem Setup & Contributions
- Recap: Straight-Through Estimator
- Encoding the discrete problem as an inner product
- From algorithm to distribution
- Substituting the gradient
- Defining a target distribution
- Approximating marginals via perturb-and-MAP
- Entire algorithm recap
- Github Page & Example
Taught by
Yannic Kilcher
Related Courses
- Generalization II (Simons Institute via YouTube)
- Deciphering Brain Codes to Build Smarter AI (MIT CBMM via YouTube)
- Weaving Together Machine Learning, Theoretical Physics, and Neuroscience (Fields Institute via YouTube)
- Prediction of Survival Analysis for Cancer Patients (International Centre for Theoretical Sciences via YouTube)
- Artificial Intelligence in Medical Imaging for Precision Medicine by Vaanathi Sundaresan (International Centre for Theoretical Sciences via YouTube)