YoVDO

Coordinate Descent Methods Beyond Separability and Smoothness

Offered By: Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube

Tags

Convex Optimization Courses
Nonconvex Optimization Courses

Course Description

Overview

Explore coordinate descent methods for optimization problems that go beyond traditional separability and smoothness constraints in this 30-minute talk by Ion Necoara at the Erwin Schrödinger International Institute for Mathematics and Physics. Delve into techniques for handling nonseparable and nonsmooth objective functions, including random coordinate proximal gradient methods and smooth approximation frameworks. Learn about the scalability of these algorithms through local approximation models along random subspaces. Examine the worst-case complexity analysis for both convex and nonconvex settings. Discover the practical applications of these methods in areas such as smallest eigenvalue problems, matrix factorization, and support vector machine classification.
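As a rough illustration of the random coordinate proximal gradient idea mentioned above, the sketch below minimizes a composite objective 0.5*||Ax - b||^2 + lam*||x||_1 by updating one randomly chosen coordinate per iteration with a soft-thresholding proximal step. This is a generic textbook variant for separable nonsmooth terms, not the specific method from the talk; the problem instance, step sizes, and iteration count are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.| (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def random_coordinate_prox_gradient(A, b, lam, n_iters=5000, seed=0):
    # Generic sketch, not the speaker's algorithm:
    # min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1
    # via randomly selected coordinate proximal gradient steps.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                       # residual, maintained incrementally
    L = np.sum(A**2, axis=0)            # per-coordinate Lipschitz constants
    for _ in range(n_iters):
        i = rng.integers(n)
        g = A[:, i] @ r                 # i-th partial derivative of the smooth part
        x_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        r += A[:, i] * (x_new - x[i])   # O(m) residual update, no full recompute
        x[i] = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.5, 1.0]
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x = random_coordinate_prox_gradient(A, b, lam=0.1)
    obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.abs(x).sum()
    obj0 = 0.5 * np.linalg.norm(b) ** 2  # objective at the starting point x = 0
    print(obj < obj0)
```

The per-iteration cost is O(m) rather than O(mn) because only one column of A is touched, which is the scalability argument coordinate methods rely on; the talk's contribution concerns extending such guarantees when the nonsmooth term is not coordinate-separable.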

Syllabus

Ion Necoara - Coordinate descent methods beyond separability and smoothness


Taught by

Erwin Schrödinger International Institute for Mathematics and Physics (ESI)

Related Courses

On Gradient-Based Optimization - Accelerated, Distributed, Asynchronous and Stochastic
Simons Institute via YouTube
Optimisation - An Introduction: Professor Coralia Cartis, University of Oxford
Alan Turing Institute via YouTube
Optimization in Signal Processing and Machine Learning
IEEE Signal Processing Society via YouTube
Methods for L_p-L_q Minimization in Image Restoration and Regression - SIAM-IS Seminar
Society for Industrial and Applied Mathematics via YouTube
Certificates of Nonnegativity and Their Applications in Theoretical Computer Science
Society for Industrial and Applied Mathematics via YouTube