Strong Convergence of Inertial Algorithms via Tikhonov Regularization
Offered By: Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube
Course Description
Overview
Explore the application of Tikhonov regularization in optimization algorithms during this 24-minute conference talk from the "One World Optimization Seminar in Vienna" workshop at the Erwin Schrödinger International Institute for Mathematics and Physics. Delve into the benefits of introducing Tikhonov regularization terms into algorithms for minimization problems, which allows pre-selection of equilibrium points and yields convergence in the strong topology. Examine two inertial algorithms with Tikhonov regularization: a proximal algorithm for non-smooth objective functions and a Nesterov-type algorithm for differentiable objective functions. Discover the relationship between extrapolation coefficients and Tikhonov regularization parameters, and learn about parameter settings that ensure strong convergence to the minimum norm solution while maintaining fast convergence for objective function values. Investigate an inertial gradient-type algorithm with two Tikhonov regularization terms, exploring its strong convergence properties and Nesterov-type rates. Understand the crucial role of both regularization terms in achieving convergence to the minimal norm solution.
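To make the idea concrete, here is a minimal numerical sketch of an inertial (Nesterov-type) gradient step with a vanishing Tikhonov term, on a toy quadratic whose minimizer set is a whole line. The parameter choices below (step size s, extrapolation coefficient beta_k = k/(k+3), Tikhonov parameter eps_k = c/k^q) are illustrative assumptions, not the specific conditions from the talk:

```python
def f(x):
    # Toy objective with a whole line of minimizers {x0 = x1};
    # its minimum-norm minimizer is the origin.
    return 0.5 * (x[0] - x[1]) ** 2

def inertial_tikhonov(x, steps=20000, s=0.1, c=1.0, q=0.5):
    """Inertial gradient step on f plus a vanishing Tikhonov term
    eps_k * x (hypothetical parameter schedule, for illustration)."""
    xp = x  # previous iterate
    for k in range(1, steps + 1):
        beta = k / (k + 3.0)              # extrapolation coefficient beta_k
        eps = c / k ** q                  # Tikhonov parameter eps_k -> 0
        y0 = x[0] + beta * (x[0] - xp[0]) # inertial extrapolation point y_k
        y1 = x[1] + beta * (x[1] - xp[1])
        d = y0 - y1                       # grad f at y_k is (d, -d)
        xp, x = x, (y0 - s * (d + eps * y0),
                    y1 - s * (-d + eps * y1))
    return x

x_star = inertial_tikhonov((3.0, 1.0))
```

Without the Tikhonov term, this scheme started at (3, 1) stalls at the nearest minimizer (2, 2); with the slowly vanishing eps_k it is steered along the minimizer set toward the minimum-norm solution at the origin, illustrating the pre-selection effect described above.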
Syllabus
Szilard Csaba Laszlo - On strong convergence of inertial algorithms via Tikhonov regularization
Taught by
Erwin Schrödinger International Institute for Mathematics and Physics (ESI)
Related Courses
Deep Learning for Natural Language Processing
University of Oxford via Independent
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Logistic Regression with Python and Numpy
Coursera Project Network via Coursera