Mirrored Langevin Dynamics - Ya-Ping Hsieh
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Explore a 39-minute conference talk on Mirrored Langevin Dynamics presented by Ya-Ping Hsieh at the Alan Turing Institute. Delve into a unified framework for posterior sampling from constrained distributions, with a focus on Latent Dirichlet Allocation (LDA). Discover novel deterministic and stochastic first-order sampling schemes inspired by mirror descent. Learn about the improved convergence rate of O(ε^{-2} d) for general target distributions with strongly convex potential, significantly advancing the prior state of the art. Examine the specialized algorithm for sampling from Dirichlet posteriors, featuring the first non-asymptotic O(ε^{-2} d^2 R_0) rate for first-order sampling. Explore the extension of the deterministic framework to mini-batch settings and its convergence rates with stochastic gradients. Gain insights into state-of-the-art experimental results for LDA on real datasets, bridging theoretical foundations with practical applications in statistics, probability, and optimization.
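To make the core idea concrete, here is a minimal sketch (not the talk's exact algorithm) of the mirrored-Langevin recipe on a one-dimensional constrained domain: a mirror map sends the constrained variable to an unconstrained dual variable, an unadjusted Langevin step is taken on the pushforward potential, and the inverse mirror map returns feasible samples. The example targets a Gamma(alpha, 1) distribution on (0, ∞) with the mirror map h(x) = x log x - x, so y = log x; all function names and parameter choices here are illustrative assumptions.

```python
import math
import random

def mirrored_ula_gamma(alpha, step=0.01, n_steps=200_000, burn_in=10_000, seed=0):
    """Approximate samples from Gamma(alpha, 1) on (0, inf) via a mirrored
    Langevin sketch (illustrative, not the speaker's exact scheme).

    Mirror map: h(x) = x*log(x) - x, so grad h(x) = log(x) and the dual
    variable y = log(x) is unconstrained.  With target potential
    V(x) = x - (alpha - 1)*log(x), the change of variables x = e^y gives the
    pushforward potential W(y) = e^y - alpha*y, hence W'(y) = e^y - alpha.
    We run the unadjusted Langevin algorithm on W and map back via x = e^y.
    """
    rng = random.Random(seed)
    y = math.log(alpha)  # start near the mode of the dual potential
    samples = []
    for k in range(n_steps):
        grad_w = math.exp(y) - alpha                     # gradient of W at y
        noise = math.sqrt(2.0 * step) * rng.gauss(0.0, 1.0)
        y = y - step * grad_w + noise                    # ULA step in dual space
        if k >= burn_in:
            samples.append(math.exp(y))                  # inverse mirror map
    return samples
```

Note that every returned sample is automatically strictly positive: feasibility is enforced by the mirror map rather than by rejection or projection, which is the point of the mirrored construction.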
Syllabus
Mirrored Langevin Dynamics - Ya-Ping Hsieh
Taught by
Alan Turing Institute
Related Courses
4.0 Shades of Digitalisation for the Chemical and Process Industries (University of Padova via FutureLearn)
A Day in the Life of a Data Engineer (Amazon Web Services via AWS Skill Builder)
FinTech for Finance and Business Leaders (ACCA via edX)
Accounting Data Analytics (University of Illinois at Urbana-Champaign via Coursera)
Accounting Data Analytics (Coursera)