Coordinate Descent Methods Beyond Separability and Smoothness
Offered By: Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube
Course Description
Overview
Explore coordinate descent methods for optimization problems that go beyond the traditional separability and smoothness assumptions in this 30-minute talk by Ion Necoara, hosted by the Erwin Schrödinger International Institute for Mathematics and Physics (ESI). Delve into techniques for handling nonseparable and nonsmooth objective functions, including random coordinate proximal gradient methods and smooth approximation frameworks. Learn how these algorithms scale by building local approximation models along random subspaces. Examine the worst-case complexity analysis in both convex and nonconvex settings. Discover practical applications of these methods in areas such as smallest-eigenvalue problems, matrix factorization, and support vector machine classification.
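To ground the terminology, here is a minimal sketch (my own illustration, not code from the talk) of the classical random coordinate proximal gradient method in the separable setting that the talk generalizes beyond: minimizing 0.5*||Ax - b||^2 + lam*||x||_1, where the l1 term splits per coordinate, so each iteration updates one randomly sampled coordinate with a soft-thresholding prox step. All names below (`soft_threshold`, `random_coord_prox_gradient`) are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrink v toward zero by t."""
    return np.sign(v) * max(abs(v) - t, 0.0)

def random_coord_prox_gradient(A, b, lam, iters=5000, seed=0):
    """Random coordinate proximal gradient for the lasso problem
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 (separable nonsmooth term)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    residual = A @ x - b                       # maintained incrementally
    L = (A * A).sum(axis=0)                    # coordinate Lipschitz constants ||A_i||^2
    for _ in range(iters):
        i = int(rng.integers(n))               # sample one coordinate uniformly
        g = A[:, i] @ residual                 # partial gradient of the smooth part
        x_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        residual += A[:, i] * (x_new - x[i])   # cheap O(m) residual update
        x[i] = x_new
    return x
```

Each step costs only one column of A, which is the scalability argument for coordinate methods; the talk's contribution concerns what to do when the nonsmooth term is not coordinate-wise separable like the l1 norm here.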
Syllabus
Ion Necoara - Coordinate descent methods beyond separability and smoothness
Taught by
Erwin Schrödinger International Institute for Mathematics and Physics (ESI)
Related Courses
Convex Optimization - Stanford University via edX
FA19: Deterministic Optimization - Georgia Institute of Technology via edX
Applied Optimization For Wireless, Machine Learning, Big Data - Indian Institute of Technology Kanpur via Swayam
Statistical Machine Learning - Eberhard Karls University of Tübingen via YouTube
Convex Optimization - NIOS via YouTube