Neural SDEs - Deep Generative Models in the Diffusion Limit - Maxim Raginsky
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore the diffusion limit of deep generative models in this comprehensive lecture. Delve into the unified perspective on sampling and variational inference through stochastic control. Learn how to quantify the expressiveness of diffusion-based generative models and discover efficient sampling techniques for a wide class of terminal target distributions. Examine the proof of sampling accuracy using Kullback-Leibler divergence and investigate an unbiased, finite-variance simulation scheme implementable as a deep generative model with a random number of layers. Cover topics such as continuous-time neural nets, parabolic PDEs, optimal control, the Schrödinger Bridge Problem, and nonparametric sampling, while gaining insights into empirical process techniques.
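To make the "diffusion limit" concrete, below is a minimal sketch (not taken from the lecture) of a neural SDE sampler: a drift given by a small neural network is integrated with Euler-Maruyama steps, so each discretization step plays the role of one layer of a deep generative model. The network architecture, step count, and noise scale are illustrative assumptions, and the drift is left untrained.

# Minimal neural SDE sketch: simulate dX_t = b_theta(X_t, t) dt + sigma dW_t
# with X_0 ~ N(0, I). The learned drift b_theta would push the terminal law
# of X_1 toward a target distribution; here the drift is randomly initialized
# purely for illustration.

import numpy as np

def drift(x, t, W1, b1, W2, b2):
    """Small MLP drift b_theta(x, t); one hidden tanh layer, conditioned on time."""
    inp = np.concatenate([x, [t]])
    h = np.tanh(W1 @ inp + b1)
    return W2 @ h + b2

def sample_neural_sde(dim=2, n_steps=100, sigma=1.0, seed=0):
    """Euler-Maruyama simulation: each step corresponds to one 'layer'."""
    rng = np.random.default_rng(seed)
    hidden = 32
    # Untrained drift parameters (illustrative only).
    W1 = rng.normal(scale=0.5, size=(hidden, dim + 1))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(dim, hidden))
    b2 = np.zeros(dim)

    dt = 1.0 / n_steps
    x = rng.normal(size=dim)  # X_0 ~ N(0, I)
    for k in range(n_steps):
        t = k * dt
        dW = rng.normal(scale=np.sqrt(dt), size=dim)
        x = x + drift(x, t, W1, b1, W2, b2) * dt + sigma * dW
    return x  # approximate draw from the terminal law of the SDE

print(sample_neural_sde())

With a trained drift, the terminal distribution of such a simulation approximates the target; the lecture's "random number of layers" scheme can be read as randomizing the number of steps to remove discretization bias.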
Syllabus
Introduction
What is a generative model
Diffusion process
Continuous-time neural nets
Questions
Coming up
Exact Sampling
Parabolic PDEs
Optimal Control
Schrödinger Bridge Problem
Variational Inference
Nonparametric Sampling
Proof
Empirical Process Techniques
Taught by
Institute for Advanced Study
Related Courses
Digital Signal Processing - École Polytechnique Fédérale de Lausanne via Coursera
Preparing for the AP* Statistics Exam - University of Houston System via Coursera
Solid Science: Research Methods - University of Amsterdam via Coursera
Preparing for the AP* Statistics Exam - Tennessee Board of Regents via edX
Processamento Digital de Sinais - Amostragem - Universidade Estadual de Campinas via Coursera