Diffusion Models for Inverse Problems
Offered By: Generative Memory Lab via YouTube
Course Description
Overview
Explore cutting-edge research on diffusion models for inverse problems in this 42-minute conference talk by Hyungjin Chung from the Generative Memory Lab. Delve into two papers: "Diffusion Posterior Sampling for General Noisy Inverse Problems" and "Improving Diffusion Models for Inverse Problems Using Manifold Constraints." Gain insight into how posterior sampling lets an unconditional diffusion model solve general noisy inverse problems, and how manifold constraints keep intermediate samples close to the data manifold to improve reconstruction quality.
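As a rough illustration of the first paper's idea, the sketch below shows one diffusion posterior sampling (DPS) style guidance step: estimate the clean image from the current noisy iterate via Tweedie's formula, measure the data-fidelity residual against the observation, and nudge the sampler's unconditional update with the gradient of that residual. All names here (eps_model, A, y, alpha_bar_t, zeta) are hypothetical stand-ins for a noise-prediction network, forward operator, measurement, noise schedule value, and step size; this is a minimal sketch under those assumptions, not the talk's reference implementation.

```python
import torch

def dps_guidance(x_t, x_prev_uncond, t, eps_model, A, y, alpha_bar_t, zeta=0.5):
    """Correct one unconditional reverse-diffusion step with a DPS-style
    measurement-consistency gradient (hypothetical interfaces, see lead-in)."""
    x_t = x_t.detach().requires_grad_(True)

    # Tweedie's formula: posterior-mean estimate of the clean image x0 from x_t.
    eps = eps_model(x_t, t)
    x0_hat = (x_t - (1.0 - alpha_bar_t) ** 0.5 * eps) / (alpha_bar_t ** 0.5)

    # Data-fidelity residual between the measurement y and the re-projected estimate.
    loss = torch.linalg.vector_norm(y - A(x0_hat)) ** 2

    # Gradient of the residual with respect to the current iterate x_t.
    grad = torch.autograd.grad(loss, x_t)[0]

    # Nudge the sampler's unconditional proposal for x_{t-1} toward measurement consistency.
    return x_prev_uncond - zeta * grad
```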
Syllabus
Diffusion Models for Inverse Problems
Taught by
Generative Memory Lab
Related Courses
Diffusion Models Beat GANs on Image Synthesis - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)
Diffusion Models Beat GANs on Image Synthesis - ML Coding Series - Part 2 (Aleksa Gordić - The AI Epiphany via YouTube)
OpenAI GLIDE - Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models (Aleksa Gordić - The AI Epiphany via YouTube)
Food for Diffusion (HuggingFace via YouTube)
Imagen: Text-to-Image Generation Using Diffusion Models - Lecture 9 (University of Central Florida via YouTube)