Convergence of Denoising Diffusion Models Under the Manifold Hypothesis
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Explore the theoretical foundations of denoising diffusion models in this 46-minute lecture by Valentin de Bortoli of CNRS, France. Delve into the convergence analysis of these state-of-the-art generative models for image and audio synthesis, focusing on settings where the target distribution is supported on a lower-dimensional manifold or is given by an empirical distribution. Examine quantitative bounds on the Wasserstein distance between the target data distribution and the generative distribution of the diffusion model. Gain insight into the theoretical underpinnings of these models, which address a limitation of existing analyses that assume the target distribution admits a density with respect to the Lebesgue measure.
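As a point of reference for the metric named above, the Wasserstein-1 distance between the data distribution and the model's generative distribution can be written in its standard optimal-coupling form; this is the general definition, not the specific bound derived in the lecture:

\[
  \mathbf{W}_1(\pi, \hat{\pi})
  = \inf_{\gamma \in \Gamma(\pi, \hat{\pi})}
    \int_{\mathbb{R}^d \times \mathbb{R}^d} \lVert x - y \rVert \,\mathrm{d}\gamma(x, y),
\]

where \(\pi\) denotes the target data distribution, \(\hat{\pi}\) the generative distribution of the diffusion model, and \(\Gamma(\pi, \hat{\pi})\) the set of couplings of the two measures. Because this metric is defined for arbitrary probability measures, it remains meaningful when the data are supported on a lower-dimensional manifold or given by an empirical distribution, which is why it suits the setting studied in the lecture.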
Syllabus
Convergence of denoising diffusion models under the manifold hypothesis
Taught by
Alan Turing Institute
Related Courses
Optimal Transport for Inverse Problems and Implicit Regularization - Society for Industrial and Applied Mathematics via YouTube
Universality of Persistence Diagrams and Bottleneck & Wasserstein Distances - Applied Algebraic Topology Network via YouTube
Analysis of Mean-Field Games - Lecture 1 - International Centre for Theoretical Sciences via YouTube
Wasserstein Distributionally Robust Optimization - Theory and Applications in Machine Learning - Institute for Pure & Applied Mathematics (IPAM) via YouTube
DeepParticle: Learning Invariant Measure by Deep Neural Network Minimizing Wasserstein Distance - Inside Livermore Lab via YouTube