Inference and Uncertainty Quantification for Noisy Matrix Completion - Lecture 5
Offered By: Georgia Tech Research via YouTube
Course Description
Overview
Explore a comprehensive lecture on inference and uncertainty quantification for noisy matrix completion, delivered by Yuxin Chen of Princeton University as part of the TRIAD Distinguished Lecture Series. Delve into a simple procedure that compensates for the bias of both convex and nonconvex estimators, leading to nearly precise non-asymptotic distributional characterizations. Learn how these characterizations enable optimal construction of confidence intervals and confidence regions for missing entries and low-rank factors. Discover how the approach avoids sample splitting and the loss of data efficiency it entails. Gain insight into a sharp characterization of the estimation accuracy of the de-biased estimators, which makes them the first tractable algorithms to achieve full statistical efficiency. Examine the intimate link between convex and nonconvex optimization that underpins the analysis, presented as joint work with Cong Ma, Yuling Yan, Yuejie Chi, and Jianqing Fan.
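To make the de-biasing idea concrete, the sketch below (Python, not from the lecture itself) simulates a noisy matrix completion problem, applies an inverse-propensity de-biasing correction to a crude rank-r spectral estimate, and forms approximate entrywise confidence intervals from a plug-in Gaussian approximation. The correction Z + P_Omega(Y - Z)/p, the leverage-based variance heuristic, and all parameter choices are illustrative assumptions, not the exact procedure presented in the talk.

```python
# Minimal illustration of de-biasing and entrywise confidence intervals for
# noisy matrix completion. All formulas below are a hedged sketch, not the
# lecture's exact estimator.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r, p, sigma = 300, 300, 3, 0.2, 0.1

# Ground-truth low-rank matrix M* = U* V*^T
U_star = rng.normal(size=(n1, r))
V_star = rng.normal(size=(n2, r))
M_star = U_star @ V_star.T

# Observe each entry independently with probability p, corrupted by N(0, sigma^2) noise
mask = rng.random((n1, n2)) < p
Y = np.where(mask, M_star + sigma * rng.normal(size=(n1, n2)), 0.0)

def truncated_svd(A, rank):
    """Return the top-`rank` singular factors of A and the rank-`rank` approximation."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank], (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Crude initial estimate: rank-r truncation of the inverse-propensity-weighted data Y/p
_, _, _, Z = truncated_svd(Y / p, r)

# De-biasing step (illustrative): add back (1/p) * P_Omega(Y - Z), then re-project to rank r
Z_debiased = Z + (mask * (Y - Z)) / p
U_d, s_d, Vt_d, M_d = truncated_svd(Z_debiased, r)

# Plug-in entrywise variance heuristic (an assumption for illustration):
# v_{jk} ~ (sigma^2 / p) * (||U_{j,.}||^2 + ||V_{k,.}||^2), using estimated singular subspaces
row_lev = np.sum(U_d**2, axis=1)      # leverage of each row subspace
col_lev = np.sum(Vt_d.T**2, axis=1)   # leverage of each column subspace
v = (sigma**2 / p) * (row_lev[:, None] + col_lev[None, :])

# Approximate 95% confidence interval for one (possibly missing) entry
j, k = 0, 1
half_width = 1.96 * np.sqrt(v[j, k])
print(f"M*[{j},{k}] = {M_star[j, k]:.3f}, "
      f"CI = [{M_d[j, k] - half_width:.3f}, {M_d[j, k] + half_width:.3f}]")
```

The final re-projection onto rank r is what averages out the extra noise injected by the correction term; without it, the de-biased matrix is unbiased entrywise but far too noisy for useful intervals.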
Syllabus
TRIAD Distinguished Lecture Series | Yuxin Chen | Princeton University
Taught by
Georgia Tech Research
Related Courses
Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent
Simons Institute via YouTube
Implicit Regularization I
Simons Institute via YouTube
Can Non-Convex Optimization Be Robust?
Simons Institute via YouTube
Finding Low-Rank Matrices - From Matrix Completion to Recent Trends
Simons Institute via YouTube
Power of Active Sampling for Unsupervised Learning
Simons Institute via YouTube