Evaluation Complexity of Algorithms for Nonconvex Optimization
Offered By: International Mathematical Union via YouTube
Course Description
Overview
Explore an in-depth analysis of global convergence rates and worst-case evaluation complexity for nonconvex smooth optimization in this 46-minute lecture by Coralia Cartis. Discover how the steepest-descent and Newton methods attain similarly sharp worst-case performance bounds, and learn about the advantages of second-order regularization techniques. Examine the benefits of incorporating higher-order derivative information in regularization frameworks, leading to improved complexity, universal properties, and certification of higher-order criticality. Investigate inexact settings, where derivative and function evaluations are only occasionally accurate, and see that worst-case complexity can still be quantified. Gain insight into methods that remain robust across these scenarios, with sharp and sometimes optimal complexity bounds.
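For context, the standard sharp worst-case bounds from this line of work (Cartis, Gould, and Toint) for reducing the gradient of a smooth nonconvex objective below a tolerance epsilon can be sketched in LaTeX as follows; this is a summary of results from the published literature, and the exact statements and assumptions presented in the lecture may differ:

\[
\text{steepest descent and Newton's method:}\quad O(\epsilon^{-2}) \ \text{evaluations,}
\]
\[
\text{cubic (second-order) regularization:}\quad O(\epsilon^{-3/2}) \ \text{evaluations,}
\]
\[
\text{$p$th-order regularization:}\quad O\!\left(\epsilon^{-(p+1)/p}\right) \ \text{evaluations.}
\]

As $p$ grows, the exponent $(p+1)/p$ approaches $1$, which is the sense in which using higher-order derivative information improves the worst-case evaluation complexity.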
Syllabus
Coralia Cartis: Evaluation complexity of algorithms for nonconvex optimization
Taught by
International Mathematical Union
Related Courses
On Gradient-Based Optimization - Accelerated, Distributed, Asynchronous and Stochastic (Simons Institute via YouTube)
Optimisation - An Introduction: Professor Coralia Cartis, University of Oxford (Alan Turing Institute via YouTube)
Optimization in Signal Processing and Machine Learning (IEEE Signal Processing Society via YouTube)
Methods for L_p-L_q Minimization in Image Restoration and Regression - SIAM-IS Seminar (Society for Industrial and Applied Mathematics via YouTube)
Certificates of Nonnegativity and Their Applications in Theoretical Computer Science (Society for Industrial and Applied Mathematics via YouTube)