Generative Modeling by Estimating Gradients of the Data Distribution - Stefano Ermon
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore a seminar on generative modeling by estimating gradients of the data distribution, presented by Stanford University's Stefano Ermon at the Institute for Advanced Study. Delve into implicit generative models, deep energy-based models, and score estimation methods. Learn about progress in generative models of text, the representation of probability distributions, and scalable learning with sliced score matching. Discover pitfalls in sample generation, including the manifold hypothesis and inaccurate score estimation in low data-density regions, and examine remedies such as Gaussian perturbation and annealed Langevin dynamics. Gain insight into joint score estimation across noise levels and experiments demonstrating the effectiveness of these approaches in various sampling scenarios.
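The sampling approach described above pairs Gaussian perturbation with annealed Langevin dynamics: scores of noise-perturbed versions of the data distribution are followed at a sequence of decreasing noise levels, with the step size rescaled at each level. Below is a minimal NumPy sketch of such a sampler; the toy Gaussian target, its closed-form perturbed score, the noise schedule, and the step-size constants are illustrative assumptions, not the talk's actual configuration.

```python
import numpy as np

def gaussian_score(x, mu, var):
    # Analytic score of an isotropic Gaussian N(mu, var * I):
    # grad_x log p(x) = -(x - mu) / var
    return -(x - mu) / var

def annealed_langevin(score_fn, x0, sigmas, steps_per_level=100, eps=2e-5, rng=None):
    """Annealed Langevin dynamics: run Langevin updates at a sequence of
    decreasing noise levels, rescaling the step size at each level."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for sigma in sigmas:                         # largest noise level first
        alpha = eps * (sigma / sigmas[-1]) ** 2  # step size proportional to sigma^2
        for _ in range(steps_per_level):
            z = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy data distribution: N(mu, I) in 2-D. Perturbing it with N(0, sigma^2 I)
# noise gives N(mu, (1 + sigma^2) I), so the perturbed score is closed-form;
# in practice a neural network would estimate these scores from data.
mu = np.array([2.0, 2.0])
perturbed_score = lambda x, sigma: gaussian_score(x, mu, 1.0 + sigma ** 2)

sigmas = np.geomspace(10.0, 0.01, num=10)        # geometric annealing schedule
samples = annealed_langevin(perturbed_score, np.zeros((1000, 2)), sigmas)
print(samples.mean(axis=0))                      # should land near mu = (2, 2)
```

Scaling the step size in proportion to sigma^2 keeps the signal-to-noise ratio of the update roughly constant across noise levels, which is the stated motivation for this schedule.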
Syllabus
Intro
Progress in generative models of text
Implicit Generative Models: directly represent the sampling process
Representation of Probability Distributions
Learning Deep Energy-Based Models using Scores
Learning with Sliced Score Matching (see the sketch after this syllabus)
Experiments: Scalability and Speed
Experiments: Fitting Deep Kernel Exponential Families
From Score Estimation to Sample Generation
Pitfall 1: Manifold Hypothesis
Pitfall 2: Inaccurate Score Estimation in Low Data-Density Regions
Data Modes
Gaussian Perturbation
Annealed Langevin Dynamics
Joint Score Estimation
Experiments: Sampling
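The sliced score matching portion of the syllabus concerns training a score network without computing the full Jacobian trace of the exact score matching objective: the scores are projected onto random directions, so only cheap Jacobian-vector products are needed. Below is a minimal PyTorch sketch of the variance-reduced form of that objective; the MLP score network, batch shapes, and number of projections are illustrative assumptions.

```python
import torch

def sliced_score_matching_loss(score_net, x, n_projections=1):
    """Sliced score matching with variance reduction (assumed form):
    E_v [ v^T (ds/dx) v ] + 0.5 * ||s(x)||^2, with v ~ N(0, I).
    The Jacobian-vector product avoids forming the full Jacobian."""
    x = x.requires_grad_(True)
    s = score_net(x)                     # model score, shape (batch, dim)
    loss = 0.5 * (s ** 2).sum(dim=-1)    # variance-reduced norm term
    for _ in range(n_projections):
        v = torch.randn_like(x)          # random projection direction
        sv = (s * v).sum()               # v^T s(x), summed over the batch
        gsv = torch.autograd.grad(sv, x, create_graph=True)[0]  # d(v^T s)/dx
        loss = loss + (v * gsv).sum(dim=-1) / n_projections     # v^T (ds/dx) v
    return loss.mean()

# Usage with a small MLP standing in for the score network (illustrative only).
score_net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Softplus(), torch.nn.Linear(64, 2))
x = torch.randn(128, 2)                  # stand-in data batch
loss = sliced_score_matching_loss(score_net, x)
loss.backward()
```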
Taught by
Institute for Advanced Study