How to Escape Saddle Points Efficiently? by Praneeth Netrapalli
Offered By: International Centre for Theoretical Sciences via YouTube
Course Description
Overview
Explore how to escape saddle points efficiently in this 51-minute conference talk by Praneeth Netrapalli at the International Centre for Theoretical Sciences. Delve into non-convex optimization, examining two major observations and the current state of the art. Gain insights into perturbed gradient descent, which augments plain gradient descent with occasional random perturbations so the iterates escape strict saddle points, analyzed first in the two- and three-dimensional quadratic cases and then in the general case. Understand the two key ingredients and the proof ideas behind the efficiency guarantee. Conclude with a discussion of open questions in the field, providing a comprehensive overview of this crucial topic in algorithms and optimization.
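For a concrete feel of the algorithm the talk discusses, here is a minimal sketch of perturbed gradient descent in Python. It is an illustration, not the speaker's exact procedure: the step size eta, gradient threshold g_thresh, perturbation radius r, and waiting time t_thresh are placeholder values, not the constants from the talk's analysis.

```python
import numpy as np

def perturbed_gradient_descent(grad, x0, eta=0.1, g_thresh=1e-3,
                               r=1e-2, t_thresh=20, max_iters=100,
                               rng=None):
    """Gradient descent that adds a random perturbation whenever the
    gradient is small (a candidate saddle point) and no perturbation
    was added within the last t_thresh steps."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    last_perturb = -t_thresh  # allow a perturbation immediately
    for t in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= g_thresh and t - last_perturb >= t_thresh:
            # Small gradient: possibly near a saddle. Sample uniformly
            # from a ball of radius r and jump, to help escape.
            xi = rng.normal(size=x.shape)
            xi *= r * rng.uniform() ** (1.0 / x.size) / np.linalg.norm(xi)
            x = x + xi
            last_perturb = t
        else:
            x = x - eta * g  # ordinary gradient step
    return x

# f(x, y) = x^2 - y^2 has a strict saddle at the origin: plain gradient
# descent started exactly there never moves, while the perturbed iterates
# drift away along the negative-curvature (y) direction.
grad = lambda v: np.array([2.0 * v[0], -2.0 * v[1]])
print(perturbed_gradient_descent(grad, np.zeros(2)))
```

On this toy objective the function is unbounded below, so the run simply demonstrates escape from the saddle rather than convergence to a minimizer.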
Syllabus
Intro
Non-convex optimization
Two major observations
State of the art
Summary of results
Setting
Perturbed gradient descent
Key question
Two-dimensional quadratic case
Three-dimensional quadratic case
General case
Two key ingredients of the proof
Proof idea
Putting everything together
Open questions
Taught by
International Centre for Theoretical Sciences
Related Courses
Convex Optimization - Stanford University via edX
FA19: Deterministic Optimization - Georgia Institute of Technology via edX
Applied Optimization For Wireless, Machine Learning, Big Data - Indian Institute of Technology Kanpur via Swayam
Statistical Machine Learning - Eberhard Karls University of Tübingen via YouTube
Convex Optimization - NIOS via YouTube