Mirrored Langevin Dynamics - Ya-Ping Hsieh

Offered By: Alan Turing Institute via YouTube

Tags

Latent Dirichlet Allocation Courses
Statistics & Probability Courses
Machine Learning Courses
Probability Courses
Algorithmic Design Courses

Course Description

Overview

Explore a 39-minute conference talk on Mirrored Langevin Dynamics presented by Ya-Ping Hsieh at the Alan Turing Institute. Delve into a unified framework for posterior sampling in constrained distributions, with a focus on Latent Dirichlet Allocation (LDA). Discover novel deterministic and stochastic first-order sampling schemes inspired by mirror descent. Learn about the improved convergence rate of O(epsilon^{-2} d) for general target distributions with strongly convex potential, significantly advancing the current state-of-the-art. Examine the specialized algorithm for sampling from Dirichlet posteriors, featuring the first non-asymptotic O(epsilon^{-2} d^2 R_0) rate for first-order sampling. Explore the extension of the deterministic framework to mini-batch settings and its convergence rates with stochastic gradients. Gain insights into state-of-the-art experimental results for LDA on real datasets, bridging theoretical foundations with practical applications in statistics, probability, and optimization.
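
To make the mirror-descent idea concrete, below is a minimal Python sketch of a mirrored Langevin step: a constrained target is mapped to an unconstrained dual space through a mirror map, plain (unadjusted) Langevin dynamics is run on the pushforward potential, and iterates are mapped back through the gradient of the conjugate map. The specific choices here (positive-orthant domain, exponential mirror map h(x) = sum(x*log x - x), a product-of-Gammas target whose normalization yields a Dirichlet, the step size, and all helper names) are illustrative assumptions for the sketch, not the exact construction presented in the talk.

```python
import numpy as np

# Sketch of mirrored Langevin dynamics on the positive orthant (assumed setup,
# not the talk's exact Dirichlet construction).
#
# Target: product of Gamma(alpha_i, 1) densities,
#   V(x) = sum(x_i - (alpha_i - 1) * log x_i),  x > 0.
# Mirror map h(x) = sum(x*log x - x), so y = grad h(x) = log x and
# x = grad h*(y) = exp(y). The pushforward (dual) potential picks up a
# log-det Jacobian term: W(y) = V(exp(y)) - sum(y), hence
#   grad W(y) = exp(y) - alpha.

def grad_W(y, alpha):
    # gradient of the dual (mirrored) potential
    return np.exp(y) - alpha

def mirrored_langevin(alpha, n_steps=20000, step=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    d = len(alpha)
    y = np.zeros(d)                      # start at x = exp(y) = 1
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        noise = rng.standard_normal(d)
        # unadjusted Langevin step in the unconstrained dual space
        y = y - step * grad_W(y, alpha) + np.sqrt(2.0 * step) * noise
        samples[t] = np.exp(y)           # map back through grad h* = exp
    return samples

if __name__ == "__main__":
    alpha = np.array([2.0, 5.0, 3.0])
    xs = mirrored_langevin(alpha)
    burn = xs[5000:]
    print("empirical Gamma means:", burn.mean(axis=0))  # approx alpha
    # normalizing each Gamma draw gives approximate Dirichlet(alpha) samples
    print("Dirichlet-like means:", (burn / burn.sum(1, keepdims=True)).mean(0))
```

The point of the construction is that the dual iterates live in an unconstrained space, so no projection or rejection step is needed to keep samples feasible; positivity of x = exp(y) is automatic.
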

Syllabus

Mirrored Langevin Dynamics - Ya-Ping Hsieh


Taught by

Alan Turing Institute

Related Courses

Design of Computer Programs
Stanford University via Udacity
Intro to Statistics
Stanford University via Udacity
Health in Numbers: Quantitative Methods in Clinical & Public Health Research
Harvard University via edX
Mathematical Biostatistics Boot Camp 1
Johns Hopkins University via Coursera
Statistics
San Jose State University via Udacity