YoVDO

Nonmonotone Forward-Backward Splitting Method for Infinite-Dimensional Optimization Problems

Offered By: Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube

Tags

Hilbert Spaces, Complexity Theory, Partial Differential Equations, Convex Functions

Course Description

Overview

Explore a 23-minute conference talk from the "One World Optimization Seminar in Vienna" workshop held at the Erwin Schrödinger International Institute for Mathematics and Physics (ESI). Delve into the convergence analysis of a nonmonotone forward-backward splitting method for tackling nonsmooth composite problems in Hilbert spaces. Examine the objective function, given as the sum of a Fréchet differentiable function and a lower semicontinuous convex function, a structure commonly found in optimization problems involving nonlinear partial differential equations with sparsity-promoting cost functionals. Learn about the algorithm's convergence and complexity, including linear convergence under quadratic growth-type conditions. Gain insights from numerical experiments that validate the theoretical findings presented in this advanced mathematical optimization talk.
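
To make the scheme concrete, below is a minimal finite-dimensional sketch of a forward-backward (proximal gradient) iteration with a nonmonotone, max-type line search, applied to an L1-regularized least-squares problem. It illustrates the general technique under stated assumptions and is not the speaker's infinite-dimensional algorithm; the names and parameters (prox_l1, nonmonotone_fbs, memory, sigma, etc.) are illustrative choices.

# Minimal sketch: forward-backward splitting with a nonmonotone line search
# for min_x F(x) = f(x) + g(x), with f smooth and g = lam*||.||_1 (sparsity).
# Illustrative only; not the exact method analyzed in the talk.
import numpy as np

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (soft-thresholding): the 'backward' step."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def nonmonotone_fbs(f, grad_f, g, prox_g, x0, alpha0=1.0, eta=0.5,
                    sigma=1e-4, memory=10, max_iter=500, tol=1e-8):
    """Forward-backward splitting with a max-type nonmonotone acceptance test."""
    x = x0.copy()
    history = [f(x) + g(x)]              # recent objective values for the nonmonotone test
    for _ in range(max_iter):
        gx = grad_f(x)
        alpha = alpha0
        while True:
            # forward (gradient) step followed by backward (proximal) step
            x_trial = prox_g(x - alpha * gx, alpha)
            decrease = sigma / alpha * np.linalg.norm(x_trial - x) ** 2
            # nonmonotone acceptance: compare against the max of recent values,
            # not the previous value alone
            if f(x_trial) + g(x_trial) <= max(history[-memory:]) - decrease:
                break
            alpha *= eta                  # backtrack the step size
        if np.linalg.norm(x_trial - x) <= tol:
            x = x_trial
            break
        x = x_trial
        history.append(f(x) + g(x))
    return x

# Example: sparse least squares, f(x) = 0.5*||A x - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
x_hat = nonmonotone_fbs(
    f=lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2,
    grad_f=lambda x: A.T @ (A @ x - b),
    g=lambda x: lam * np.linalg.norm(x, 1),
    prox_g=lambda v, t: prox_l1(v, t * lam),
    x0=np.zeros(100),
)

The nonmonotone test accepts a trial point if it improves on the maximum of a short memory of past objective values, rather than demanding a decrease at every iteration; this is the feature that distinguishes the approach from the classical monotone backtracking variant.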

Syllabus

Behzad Azmi - Nonmonotone Forward-Backward Splitting Method for a Class of Infinite-Dimensional Optimization Problems


Taught by

Erwin Schrödinger International Institute for Mathematics and Physics (ESI)

Related Courses

Convex Optimization
Stanford University via edX
Non Linear Programming
NIOS via YouTube
A Primer to Mathematical Optimization
Indian Institute of Technology (BHU) Varanasi via Swayam
Gradient Descent and Stochastic Gradient Descent
Paul Hand via YouTube
Operator Scaling via Geodesically Convex Optimization, Invariant Theory and Polynomial Identity Testing - Yuanzhi Li
Institute for Advanced Study via YouTube