Can Non-Convex Optimization Be Robust?

Offered By: Simons Institute via YouTube

Tags

Matrix Completion, High-dimensional Statistics

Course Description

Overview

Explore the challenges and possibilities of robust non-convex optimization in this 46-minute lecture by Rong Ge from Duke University. Delve into the reasons behind the perceived ease of non-convex optimization, examine locally optimizable functions, and investigate the consequences of failed assumptions. Learn about robust non-convex optimization techniques using perturbed objectives, and understand the motivation behind comparing empirical risk to population risk. Discover the concept of smoothing and its properties, as well as ideas for establishing lower bounds. Examine matrix completion, semi-random adversaries, and counter-examples. Gain insights into preprocessing techniques and conclude with a summary of key points and open problems in the field of robust and high-dimensional statistics.
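The smoothing idea mentioned above, replacing an objective with its average under small random perturbations, can be illustrated with a minimal Monte Carlo sketch. The objective function and all parameter choices below are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # A non-convex 1-D objective with many small wiggles
    # (an illustrative choice, not from the lecture).
    return np.sin(5 * x) + x ** 2

def smoothed(f, x, sigma=0.3, n_samples=2000):
    # Monte Carlo estimate of the Gaussian-smoothed objective
    #   E_{z ~ N(0, sigma^2)} [ f(x + z) ].
    # Averaging over perturbations damps small spurious local
    # minima while preserving the large-scale shape of f.
    z = rng.normal(0.0, sigma, size=n_samples)
    return float(np.mean(f(x + z)))

# Evaluate the smoothed objective on a grid.
xs = np.linspace(-2, 2, 9)
vals = [smoothed(f, x) for x in xs]
```

The same averaging trick gives a stochastic estimate of the smoothed gradient, which is the usual starting point for analyzing perturbed-objective methods.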

Syllabus

Intro
Why is non-convex optimization "easy"?
Locally optimizable functions
What happens when assumptions fail?
Robust non-convex optimization with perturbed objective
Motivation: Empirical Risk vs. Population Risk
Idea: Smoothing
Properties of Smoothing
Ideas of the Lower Bound
Matrix Completion
Semi-Random Adversary
Counterexamples
Preprocessing
Summary
Open Problems
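The matrix completion setting in the syllabus is a canonical locally optimizable problem: recover a low-rank matrix from a subset of its entries by gradient descent on a non-convex factorized objective. The following is a minimal synthetic sketch; the problem sizes, sampling rate, and step size are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ground truth: a symmetric rank-2 PSD matrix.
n, r = 20, 2
U_true = rng.normal(size=(n, r))
M = U_true @ U_true.T

# Observe each entry independently with probability p.
p = 0.5
mask = rng.random((n, n)) < p

# Non-convex objective: fit a factorization U U^T to the
# observed entries, f(U) = || mask * (U U^T - M) ||_F^2.
U = rng.normal(scale=0.1, size=(n, r))
lr = 0.002
for _ in range(5000):
    R = mask * (U @ U.T - M)   # residual on observed entries only
    grad = 2 * (R + R.T) @ U   # gradient of f with respect to U
    U -= lr * grad

# Relative reconstruction error over ALL entries, including
# the unobserved ones; it should be small after convergence.
err = np.linalg.norm(U @ U.T - M) / np.linalg.norm(M)
```

With benign (uniform) random observations, results of the kind surveyed in the lecture explain why such descent from small random initialization recovers M; the semi-random adversary discussion asks what happens when the observation pattern is chosen less helpfully.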


Taught by

Simons Institute

Related Courses

Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent
Simons Institute via YouTube
Implicit Regularization I
Simons Institute via YouTube
Finding Low-Rank Matrices - From Matrix Completion to Recent Trends
Simons Institute via YouTube
Power of Active Sampling for Unsupervised Learning
Simons Institute via YouTube
Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora
Institute for Advanced Study via YouTube