Computer-Aided Lyapunov Analyses and Counter-Examples to the Convergence of First-Order Optimization Methods

Offered By: Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube

Tags

Optimization Algorithms Courses Gradient Descent Courses

Course Description

Overview

Explore computer-aided Lyapunov analyses and counter-examples to the convergence of first-order optimization methods in this 33-minute conference talk by Adrien Taylor at the Erwin Schrödinger International Institute for Mathematics and Physics. Delve into constructive approaches for discovering Lyapunov functions and their structural properties in the context of first-order optimization algorithms. Learn about methodologies for constructing counter-examples when no such Lyapunov functions exist. Examine example-based analyses of simple optimization algorithms such as gradient descent, the heavy-ball method, and the Chambolle-Pock algorithm. Gain insights from joint research on automated convergence guarantees, tight Lyapunov analyses, and provable non-acceleration results for optimization methods.
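
As a flavor of the kind of Lyapunov (potential) arguments the talk studies, here is a classical textbook sketch, not taken from the talk itself: for an L-smooth convex function f minimized by gradient descent with step size 1/L, x_{k+1} = x_k - (1/L)\nabla f(x_k), the potential

\[
\phi_k = k\,\bigl(f(x_k) - f(x_\star)\bigr) + \tfrac{L}{2}\,\lVert x_k - x_\star \rVert^2
\]

is non-increasing, \phi_{k+1} \le \phi_k, which immediately yields the rate f(x_k) - f(x_\star) \le \tfrac{L}{2k}\lVert x_0 - x_\star \rVert^2. Computer-aided approaches of the kind discussed in the talk search for such potentials automatically, or certify that no potential of a prescribed structure exists.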

Syllabus

Adrien Taylor - Computer-aided Lyapunov analyses & counter-examples to the convergence of first-order optimization methods


Taught by

Erwin Schrödinger International Institute for Mathematics and Physics (ESI)

Related Courses

Deep Learning for Natural Language Processing
University of Oxford via Independent
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Logistic Regression with Python and Numpy
Coursera Project Network via Coursera