YoVDO

Estimating Normalizing Constants for Log-Concave Distributions - Algorithms and Lower Bounds

Offered By: Association for Computing Machinery (ACM) via YouTube

Tags

Machine Learning Courses
Algorithms Courses
Langevin Dynamics Courses

Course Description

Overview

Explore algorithms and lower bounds for estimating normalizing constants of log-concave distributions in this 26-minute ACM conference talk. Delve into the problem statement, upper bounds obtained via annealing and multilevel Monte Carlo, and a comparison with plain Monte Carlo methods. Examine sampling algorithms, focusing on Langevin dynamics, its discretization, and coupling arguments. Investigate lower bounds for both low and high dimensions, including proof ideas and the problem of distinguishing biased coins. Conclude with insights on partitioning dimensions and key takeaways for computational statistics and machine learning applications.
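The annealing approach mentioned in the overview can be sketched on a toy one-dimensional Gaussian target, where every intermediate distribution can be sampled exactly and the true normalizing constant is known in closed form. The potential `f`, the geometric schedule `betas`, and the estimator below are illustrative assumptions for this demo, not the talk's actual construction:

```python
import math
import random

def f(x):
    # Potential of the target density p(x) ∝ exp(-f(x)).
    # Standard Gaussian potential, so the true Z is sqrt(2*pi).
    return 0.5 * x * x

def anneal_estimate_Z(betas, n_samples=20000, seed=0):
    """Telescoping (annealing) estimator:
        Z(beta_K) = Z(beta_0) * prod_k E_{p_k}[exp(-(beta_{k+1} - beta_k) f(x))],
    where p_k(x) ∝ exp(-beta_k f(x)).  In this Gaussian toy case p_k is
    N(0, 1/beta_k), so each intermediate law is sampled exactly; in general
    one would plug in an approximate sampler such as Langevin dynamics."""
    rng = random.Random(seed)
    # Z(beta_0) is known in closed form for the Gaussian: sqrt(2*pi/beta_0).
    log_Z = 0.5 * math.log(2 * math.pi / betas[0])
    for bk, bk1 in zip(betas, betas[1:]):
        acc = 0.0
        for _ in range(n_samples):
            x = rng.gauss(0.0, 1.0 / math.sqrt(bk))  # exact sample from p_k
            acc += math.exp(-(bk1 - bk) * f(x))
        log_Z += math.log(acc / n_samples)           # one telescoping ratio
    return math.exp(log_Z)

# Geometric annealing schedule from a flat beta_0 = 0.1 up to the target beta = 1.
betas = [0.1 * 1.26 ** i for i in range(10)] + [1.0]
print(anneal_estimate_Z(betas))   # should be close to sqrt(2*pi) ≈ 2.5066
```

Estimating each ratio between *nearby* distributions keeps the variance of every factor small, which is the reason annealing beats a single importance-sampling estimate from the reference straight to the target.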

Syllabus

Intro
Problem statement
Upper bound: Annealing
Upper bound: Multilevel Monte Carlo
Regular Monte Carlo
Sampling algorithm: Langevin dynamics
Discretizing Langevin dynamics
Coupling Langevin dynamics (Overdamped)
Lower bound for low dimensions
Proof idea
Distinguishing biased coins
Lower bound for high dimensions
Take product distribution
Partition the dimensions
Conclusion
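As a rough illustration of the discretized overdamped Langevin dynamics covered in the syllabus, here is a minimal unadjusted Langevin (Euler-Maruyama) sketch targeting a standard Gaussian. The step size, burn-in, and sample counts are arbitrary demo choices, not values from the talk:

```python
import math
import random

def grad_f(x):
    # Gradient of f(x) = x^2 / 2, the standard Gaussian potential.
    return x

def ula_samples(n_steps=50000, step=0.01, burn_in=5000, seed=0):
    """Unadjusted Langevin Algorithm, the Euler-Maruyama discretization of
    overdamped Langevin dynamics:
        x' = x - h * grad_f(x) + sqrt(2h) * N(0, 1).
    Its stationary law approximates p(x) ∝ exp(-f(x)) up to an O(h) bias."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for t in range(n_steps):
        x = x - step * grad_f(x) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)
        if t >= burn_in:               # discard the transient
            out.append(x)
    return out

xs = ula_samples()
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(mean, var)   # should be near 0 and 1 for the standard Gaussian target
```

For a Gaussian target the chain is exactly an AR(1) process with stationary variance 1/(1 - h/2), which makes the O(h) discretization bias of ULA visible in closed form.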

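The "distinguishing biased coins" step of the lower-bound argument can likewise be sketched: telling a fair coin apart from one with heads-probability 1/2 + eps requires on the order of 1/eps^2 flips, and this sample-complexity fact is the standard engine behind such information-theoretic lower bounds. A minimal simulation, with illustrative constants:

```python
import random

def guess_biased(p, eps, n, rng):
    """Flip a p-coin n times; guess 'biased' iff the empirical mean of
    heads exceeds the midpoint 1/2 + eps/2."""
    heads = sum(rng.random() < p for _ in range(n))
    return heads / n > 0.5 + eps / 2

def success_rate(eps, n, trials=2000, seed=0):
    """Fraction of trials in which the threshold test correctly labels both
    a fair coin and an eps-biased coin.  Chernoff bounds give error
    exp(-Omega(n * eps^2)), so n = c / eps^2 flips suffice."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        ok += not guess_biased(0.5, eps, n, rng)        # fair coin -> 'fair'
        ok += guess_biased(0.5 + eps, eps, n, rng)      # biased -> 'biased'
    return ok / (2 * trials)

eps = 0.1
print(success_rate(eps, n=int(16 / eps ** 2)))   # n = 1600 flips, near-perfect
```

Conversely, with far fewer than 1/eps^2 flips the two coins are statistically indistinguishable, which is how hardness for the coins transfers to hardness for estimating normalizing constants.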

Taught by

Association for Computing Machinery (ACM)

Related Courses

Insights on Gradient-Based Algorithms in High-Dimensional Learning
Simons Institute via YouTube
Sampling Using Diffusion Processes, From Langevin to Schrödinger
Simons Institute via YouTube
Stochastic Analysis and Geometric Functional Inequalities
Hausdorff Center for Mathematics via YouTube
Speeding up Langevin Dynamics by Mixing
Society for Industrial and Applied Mathematics via YouTube
Optimal Transport and High-Dimensional Probability - Langevin Dynamics via Calculus
Institute for Mathematical Sciences via YouTube