Nonnegative Polynomials, Nonconvex Polynomial Optimization, and Applications to Learning

Offered By: Simons Institute via YouTube

Tags

Polynomials

Course Description

Overview

Explore a comprehensive lecture on nonnegative polynomials, nonconvex polynomial optimization, and their applications in learning. Delve into shape-constrained regression, difference of convex (DC) programming, and monotone regression, including problem definitions and NP-hardness. Examine SOS relaxation techniques, approximation theorems, and numerical experiments in low noise environments. Investigate DC decomposition, the Convex-Concave Procedure (CCP), and strategies for selecting optimal decompositions. Gain insights into undominated decompositions and learn to compare different approaches. Understand the wide-ranging applications of optimization over nonnegative polynomials and the power of SDP/SOS-based relaxations in this field.
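To make the Convex-Concave Procedure (CCP) mentioned above concrete, here is a minimal toy sketch. It is not taken from the lecture: the polynomial, its DC decomposition, and the function names are illustrative choices. CCP minimizes a difference of convex functions f = g - h by repeatedly linearizing the concave part (-h) at the current iterate and solving the resulting convex subproblem; for this one-variable example the subproblem has a closed-form solution.

```python
import math

# Hypothetical example (not from the lecture): minimize f(x) = x^4 - 2x^2,
# written as a difference of convex (DC) functions f = g - h with
#   g(x) = x^4   (convex)   and   h(x) = 2x^2  (convex).
# One CCP step linearizes h at the current iterate x_k and solves
#   x_{k+1} = argmin_x  g(x) - [h(x_k) + h'(x_k)(x - x_k)]
#           = argmin_x  x^4 - 4*x_k*x            (constants dropped),
# whose stationarity condition 4x^3 = 4*x_k gives x_{k+1} = cbrt(x_k).

def ccp(x0, iters=50):
    """Run CCP on the toy DC program above from the starting point x0."""
    x = x0
    for _ in range(iters):
        # Closed-form solution of the convex subproblem: real cube root of x.
        x = math.copysign(abs(x) ** (1.0 / 3.0), x)
    return x

print(ccp(0.5))   # iterates approach the local minimizer x = 1 of f
print(ccp(-0.5))  # starting on the other side, they approach x = -1
```

As the two runs suggest, CCP only guarantees convergence to a local solution, which is why the lecture's discussion of picking a "best" (e.g., undominated) DC decomposition matters: different decompositions of the same f can steer the procedure to different points.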

Syllabus

Intro
Optimizing over nonnegative polynomials
1. Shape-constrained regression
2. Difference of Convex (DC) programming: problems of the form min f₀(x)
Monotone regression: problem definition
NP-hardness and SOS relaxation
Approximation theorem
Numerical experiments (1/2): low-noise environment
Difference of convex (DC) decomposition
Existence of DC decomposition (2/3)
Convex-Concave Procedure (CCP)
Picking the "best" decomposition for CCP
Undominated decompositions (1/2)
Comparing different decompositions (1/2)
Main messages: optimization over nonnegative polynomials has many applications; powerful SDP/SOS-based relaxations are available
Uniqueness of DC decomposition


Taught by

Simons Institute

Related Courses

Intermediate Algebra
University of California, Irvine via Coursera
Visualizing Algebra
San Jose State University via Udacity
College Algebra
San Jose State University via Udacity
Combinatorics for Beginners (Комбинаторика для начинающих)
Moscow Institute of Physics and Technology via Coursera
Basic Algebra (Álgebra básica)
Universidad Nacional Autónoma de México via Coursera