YoVDO

Mirrored Langevin Dynamics - Ya-Ping Hsieh

Offered By: Alan Turing Institute via YouTube

Tags

Latent Dirichlet Allocation Courses
Statistics & Probability Courses
Machine Learning Courses
Probability Courses
Algorithmic Design Courses

Course Description

Overview

Explore a 39-minute conference talk on Mirrored Langevin Dynamics presented by Ya-Ping Hsieh at the Alan Turing Institute. Delve into a unified framework for posterior sampling from constrained distributions, with a focus on Latent Dirichlet Allocation (LDA). Discover novel deterministic and stochastic first-order sampling schemes inspired by mirror descent. Learn about the improved O(ε^{-2} d) convergence rate for general target distributions with strongly convex potentials, a significant advance over the prior state of the art. Examine a specialized algorithm for sampling from Dirichlet posteriors, featuring the first non-asymptotic O(ε^{-2} d^2 R_0) rate for first-order sampling. Explore the extension of the deterministic framework to mini-batch settings and its convergence rates under stochastic gradients. Gain insights into state-of-the-art experimental results for LDA on real datasets, bridging theoretical foundations with practical applications in statistics, probability, and optimization.
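To make the mirror-descent idea concrete, here is a minimal, hedged sketch (not the talk's LDA algorithm) of mirrored Langevin dynamics on a simple constrained target. The assumption: to sample from a density e^{-V} on a constrained domain, pick a mirror map h, push the target to the unconstrained dual space via y = ∇h(x), run plain unadjusted Langevin there, and map samples back with x = ∇h*(y). The target, mirror map, and all parameter names below are illustrative choices, not from the source.

```python
import numpy as np

# Illustrative target: Gamma(a, 1) on (0, inf), density ∝ x^{a-1} e^{-x},
# potential V(x) = x - (a - 1) log x.  Mirror map: negative entropy
# h(x) = x log x - x, so grad h(x) = log x and grad h*(y) = e^y.
# Change of variables gives the dual potential
#     W(y) = V(e^y) - log det(grad^2 h*(y)) = e^y - a*y,
# and unadjusted Langevin on W never leaves the feasible set after
# mapping back, which is the point of the mirrored construction.

def mirrored_langevin_gamma(a=3.0, eta=0.005, n_steps=200_000,
                            burn_in=20_000, seed=0):
    rng = np.random.default_rng(seed)
    y = 0.0          # dual-space state (unconstrained)
    samples = []
    for k in range(n_steps):
        grad_W = np.exp(y) - a                  # gradient of dual potential
        y = y - eta * grad_W + np.sqrt(2.0 * eta) * rng.standard_normal()
        if k >= burn_in:
            samples.append(np.exp(y))           # map back: x = e^y > 0
    return np.array(samples)

x = mirrored_langevin_gamma()
print(x.mean())  # Gamma(3, 1) has mean 3; the estimate should be close
```

Every iterate stays strictly positive by construction, so no projection or rejection step is needed; the talk's Dirichlet-posterior scheme applies the same recipe with the entropic mirror map on the probability simplex.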

Syllabus

Mirrored Langevin Dynamics - Ya-Ping Hsieh


Taught by

Alan Turing Institute

Related Courses

Computing Form and Shape: Python Programming with the Rhinoscript Library
Rhode Island School of Design via Kadenze
Algorithms and Data Structures
University of California, San Diego via edX
Learning Grasshopper
LinkedIn Learning
Stay Ahead in Architecture with Algorithmic Design
LinkedIn Learning
Learning Algorithmic Design with Grasshopper
LinkedIn Learning