
Random Initialization and Implicit Regularization in Nonconvex Statistical Estimation - Lecture 2

Offered By: Georgia Tech Research via YouTube

Tags

Nonconvex Optimization, Gradient Descent, Implicit Regularization, Matrix Completion

Course Description

Overview

Explore the second lecture in a five-part series featuring Princeton University's Yuxin Chen, focusing on random initialization and implicit regularization in nonconvex statistical estimation. Delve into the phenomenon where gradient descent converges to optimal solutions of nonconvex problems such as phase retrieval and matrix completion, achieving near-optimal statistical and computational guarantees without careful initialization or explicit regularization. Examine the leave-one-out approach used to decouple the statistical dependency between the gradient descent iterates and the data. Learn how this method applies to noisy matrix completion, where it yields near-optimal entrywise error control. Investigate topics such as low-rank matrix recovery, quadratic systems of equations, two-stage approaches, population-level state evolution, and automatic saddle avoidance in this 48-minute talk from the TRIAD Distinguished Lecture Series at Georgia Tech.
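To make the headline phenomenon concrete, here is a minimal sketch of randomly initialized gradient descent for phase retrieval, assuming Gaussian measurements y_i = (a_i^T x*)^2 and the standard least-squares loss; the step size, problem sizes, and iteration count below are illustrative choices, not taken from the lecture.

    # Sketch: plain gradient descent from a random start for phase retrieval.
    # Model: y_i = (a_i^T x*)^2 with Gaussian a_i;
    # loss: f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 100, 1000                         # signal dimension, number of measurements
    x_star = rng.standard_normal(n)
    x_star /= np.linalg.norm(x_star)         # planted unit-norm signal
    A = rng.standard_normal((m, n))          # rows are the sensing vectors a_i
    y = (A @ x_star) ** 2                    # phaseless (quadratic) measurements

    def grad(x):
        # Gradient of f: (1/m) * sum_i ((a_i^T x)^2 - y_i) * (a_i^T x) * a_i
        Ax = A @ x
        return A.T @ ((Ax ** 2 - y) * Ax) / m

    x = rng.standard_normal(n) / np.sqrt(n)  # random initialization -- no spectral step
    eta = 0.1                                # constant step size (heuristic choice)
    for _ in range(500):
        x = x - eta * grad(x)

    # The solution is identifiable only up to a global sign flip.
    err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
    print(f"relative error after 500 iterations: {err:.2e}")

Up to the unavoidable global sign ambiguity, the iterates typically converge to x* even though nothing beyond vanilla gradient descent from a random start is used, which is precisely the phenomenon the lecture analyzes.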

Syllabus

Intro
Statistical models come to the rescue
Example: low-rank matrix recovery
Solving quadratic systems of equations
A natural least squares formulation
Rationale of two-stage approach
What does prior theory say?
Exponential growth of signal strength in Stage 1
Our theory: noiseless case
Population-level state evolution
Back to finite-sample analysis
Gradient descent theory revisited
A second look at gradient descent theory
Key proof idea: leave-one-out analysis
Key proof ingredient: random-sign sequences
Automatic saddle avoidance
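
As a companion to the "Population-level state evolution" and "Exponential growth of signal strength in Stage 1" items above, the following is a sketch of the infinite-sample dynamics for phase retrieval with Gaussian designs and a unit-norm planted signal; the notation \alpha_t = \langle x_t, x^\star \rangle (signal component) and \beta_t (size of the orthogonal component) is generic and may differ from the lecture's slides:

    \alpha_{t+1} = \bigl[\, 1 + 3\eta\,(1 - \alpha_t^2 - \beta_t^2) \,\bigr]\, \alpha_t,
    \qquad
    \beta_{t+1} = \bigl[\, 1 + \eta\,(1 - 3\alpha_t^2 - 3\beta_t^2) \,\bigr]\, \beta_t .

The multiplier on \alpha_t always exceeds the one on \beta_t, so the ratio \alpha_t / \beta_t grows geometrically: this is the exponential growth of signal strength in Stage 1, after which Stage 2 is ordinary local convergence.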
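
The leave-one-out proof idea can likewise be summarized schematically, under the same generic notation. For each sample l, one runs an auxiliary gradient descent sequence on the loss with that sample deleted:

    f^{(l)}(x) = \frac{1}{4m} \sum_{i \ne l} \bigl[ (a_i^\top x)^2 - y_i \bigr]^2,
    \qquad
    x_{t+1}^{(l)} = x_t^{(l)} - \eta \, \nabla f^{(l)}\bigl( x_t^{(l)} \bigr).

Because x_t^{(l)} never touches (a_l, y_l), it is statistically independent of a_l; showing that the true iterate x_t stays uniformly close to x_t^{(l)} then restores the concentration arguments that the dependence between iterates and data would otherwise break.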


Taught by

Georgia Tech Research

Related Courses

On Gradient-Based Optimization - Accelerated, Distributed, Asynchronous and Stochastic
Simons Institute via YouTube
Optimisation - An Introduction: Professor Coralia Cartis, University of Oxford
Alan Turing Institute via YouTube
Optimization in Signal Processing and Machine Learning
IEEE Signal Processing Society via YouTube
Methods for L_p-L_q Minimization in Image Restoration and Regression - SIAM-IS Seminar
Society for Industrial and Applied Mathematics via YouTube
Certificates of Nonnegativity and Their Applications in Theoretical Computer Science
Society for Industrial and Applied Mathematics via YouTube