
Convergence of Denoising Diffusion Models Under the Manifold Hypothesis

Offered By: Alan Turing Institute via YouTube

Tags

Generative Models Courses
Machine Learning Courses
Image Synthesis Courses
Audio Synthesis Courses
Wasserstein Distances Courses

Course Description

Overview

Explore the theoretical foundations of denoising diffusion models in this 46-minute lecture by Valentin de Bortoli from CNRS, France. Delve into the convergence analysis of these state-of-the-art generative models for image and audio synthesis, focusing on scenarios where the target distribution is supported on a lower-dimensional manifold or given by an empirical distribution. Examine quantitative bounds on the Wasserstein distance between the target data distribution and the generative distribution of diffusion models. Gain insights into the theoretical underpinnings of these models, addressing a limitation of current approaches, which assume that the target distribution admits a density with respect to the Lebesgue measure.
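As background, a minimal sketch of the standard setting behind such results (the notation here is the usual one and may differ from the lecture's): denoising diffusion models add noise to the data with an Ornstein-Uhlenbeck forward process and generate samples by approximating its time reversal, with the error of the generative distribution measured in Wasserstein distance.

\[
\mathrm{d}X_t = -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad X_0 \sim \pi \ \text{(data distribution)},
\]
\[
\mathrm{d}Y_t = \{Y_t + 2\nabla \log p_{T-t}(Y_t)\}\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad Y_0 \sim \mathcal{N}(0, \mathrm{Id}),
\]
\[
W_1(\mu, \nu) = \inf_{\gamma \in \Gamma(\mu, \nu)} \int \|x - y\|\,\mathrm{d}\gamma(x, y).
\]

When \(\pi\) is supported on a lower-dimensional manifold or is an empirical distribution, it admits no density with respect to the Lebesgue measure, which is why the Wasserstein distance is a natural metric for the bounds discussed in the lecture.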

Syllabus

Convergence of denoising diffusion models under the manifold hypothesis


Taught by

Alan Turing Institute

Related Courses

Regularization for Optimal Transport and Dynamic Time Warping Distances - Marco Cuturi
Alan Turing Institute via YouTube
Analysis of Mean-Field Games - Lecture 1
International Centre for Theoretical Sciences via YouTube
Why Should Q=P in the Wasserstein Distance Between Persistence Diagrams?
Applied Algebraic Topology Network via YouTube
Washington Mio - Stable Homology of Metric Measure Spaces
Applied Algebraic Topology Network via YouTube
Wasserstein Distributionally Robust Optimization - Theory and Applications in Machine Learning
Institute for Pure & Applied Mathematics (IPAM) via YouTube