On the Critic Function of Implicit Generative Models - Arthur Gretton
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore the critic function of implicit generative models in this comprehensive seminar on Theoretical Machine Learning. Delve into divergence measures, variational forms, and topological properties as Arthur Gretton from University College London discusses generalized energy-based models and their applications. Examine the advantages and disadvantages of various approaches, including neural net divergence and generalized likelihood. Gain insights into smoothness properties, multimodality, and the challenges of jumping between modes in generative models. Understand the risks of memorization and the importance of realistic sampling in machine learning applications.
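As background for the syllabus items on the variational form and its lower bound, one standard formulation (a sketch for orientation, not necessarily the exact expression used in the seminar) is the Donsker-Varadhan representation of the KL divergence, in which a critic function f is maximized over a function class F:

\mathrm{KL}(P \,\|\, Q) \;=\; \sup_{f \in \mathcal{F}} \; \mathbb{E}_{X \sim P}[f(X)] \;-\; \log \mathbb{E}_{Y \sim Q}\!\left[e^{f(Y)}\right]

Taking F to be all bounded measurable functions recovers KL exactly; restricting F to a smoother class yields a lower bound, and the maximizing f plays the role of the critic whose smoothness properties the overview refers to.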
Syllabus
Introduction
Outline
Divergence measures
The critic function
Variational form
Lower bound
Topological properties
Disadvantages of KL
Generalized energy-based models
The generator
Generalized energy-based models
Generalized likelihood
Graphical example
Energy function
Sampling
Realistic
Neural net divergence
How close is Q to P
Will I hit P
Smoothness properties
Jumping from mode to mode
I was happy to see it go from mode to mode
Risk of memorization
Generalized energy-based model
Generalized likelihood
Multimodality
Smooth functions
The kernel beer
Mark
Taught by
Institute for Advanced Study
Related Courses
A Path Towards Autonomous Machine Intelligence - Paper Explained
Yannic Kilcher via YouTube
Author Interview - VOS: Learning What You Don't Know by Virtual Outlier Synthesis
Yannic Kilcher via YouTube
Self-Supervised Learning - The Dark Matter of Intelligence
Yannic Kilcher via YouTube
Backpropagation and Deep Learning in the Brain
Simons Institute via YouTube
Your Brain on Energy-Based Models - Applying and Scaling EBMs to Problems
Institute for Advanced Study via YouTube