
Global Convergence and Asymptotic Optimality of the Heavy Ball Method

Offered By: GERAD Research Center via YouTube

Tags

Optimization Algorithms Courses

Course Description

Overview

Explore optimization algorithms in this 48-minute seminar from the GERAD Research Center. Delve into the urban legend surrounding the "complexity lower bound" for strongly convex functions with Lipschitz-continuous gradients. Revisit Polyak's original heavy-ball algorithm and examine the conditions required for its global convergence. Iman Shames of The Australian National University presents his research on the heavy-ball method's global convergence and asymptotic optimality.
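
For readers unfamiliar with the method, the update rule behind the talk is Polyak's two-term recursion x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1}), which adds a momentum term to plain gradient descent. The short Python sketch below illustrates it on a strongly convex quadratic; the objective, the step size alpha, the momentum beta, and the name heavy_ball are illustrative assumptions for this page, not values or code from the seminar.

    # A minimal sketch of Polyak's heavy-ball iteration. The test problem
    # and the alpha/beta values are illustrative assumptions, not
    # parameters taken from the seminar.
    import numpy as np

    def heavy_ball(grad, x0, alpha=0.1, beta=0.9, iters=200):
        """Iterate x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1})."""
        x_prev = x0.copy()
        x = x0.copy()
        for _ in range(iters):
            x_next = x - alpha * grad(x) + beta * (x - x_prev)
            x_prev, x = x, x_next
        return x

    # Strongly convex quadratic f(x) = 0.5*x'Ax - b'x; its gradient Ax - b
    # is Lipschitz continuous, the setting named in the description above.
    A = np.array([[3.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 2.0])
    print(heavy_ball(lambda x: A @ x - b, x0=np.zeros(2)))  # approx. [1/3, 2]

On quadratics like this one, suitable choices of alpha and beta are known to give a faster rate than plain gradient descent; the seminar concerns when and how such convergence guarantees extend globally.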

Syllabus

Global Convergence and Asymptotic Optimality of the Heavy Ball Method, Iman Shames


Taught by

GERAD Research Center

Related Courses

Deep Learning for Natural Language Processing
University of Oxford via Independent
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Logistic Regression with Python and Numpy
Coursera Project Network via Coursera