On the Variance and Admissibility of Empirical Risk Minimization on Convex Classes

Offered By: Hausdorff Center for Mathematics via YouTube

Tags

Statistical Learning Theory Courses Convex Optimization Courses

Course Description

Overview

Explore the concept of empirical risk minimization (ERM) in estimating unknown functions from noisy samples in this 51-minute lecture by Eli Putterman at the Hausdorff Center for Mathematics. Delve into the challenges of minimizing expected error when estimating functions belonging to a known class. Examine why ERM, despite its intuitive appeal, can be minimax suboptimal for certain function classes. Discover recent findings showing that ERM's variance is always minimax optimal under mild assumptions, implying that suboptimality must stem from bias. Learn about the proof technique involving concentration of measure for Lipschitz functions on Gauss space. If time allows, gain insights into how these results provide a new proof for Chatterjee's theorem on ERM's admissibility as an estimator. This talk, based on joint work with Gil Kur and Alexander Rakhlin, provides all necessary statistical background for a comprehensive understanding of the topic.
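To make the setup concrete, here is a minimal sketch of ERM over a convex class in the squared-error setting described above. The specific class (linear functions f_w(x) = ⟨w, x⟩ with ‖w‖ ≤ 1, a convex constraint set) and all variable names are illustrative assumptions, not details taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 5, 200
w_true = np.ones(d) / np.sqrt(d)           # unknown target, lies in the class
X = rng.normal(size=(n, d))                # sample points x_1, ..., x_n
y = X @ w_true + 0.1 * rng.normal(size=n)  # noisy observations y_i = f(x_i) + noise

# ERM: minimize the empirical risk (1/n) * sum_i (y_i - <w, x_i>)^2 over the
# convex class {w : ||w|| <= 1}, here via projected gradient descent.
w = np.zeros(d)
step = 0.1
for _ in range(500):
    grad = (2.0 / n) * X.T @ (X @ w - y)   # gradient of the empirical risk
    w -= step * grad
    norm = np.linalg.norm(w)
    if norm > 1.0:                         # project back onto the unit ball
        w /= norm

erm_risk = float(np.mean((X @ w - y) ** 2))
```

Because the class is convex, the empirical risk has a unique minimizer over it, and projected gradient descent converges to the ERM solution; the lecture's question is how well that minimizer estimates the true function in expectation.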

Syllabus

Eli Putterman: On the variance and admissibility of empirical risk minimization on convex classes


Taught by

Hausdorff Center for Mathematics

Related Courses

Convex Optimization
Stanford University via edX
FA19: Deterministic Optimization
Georgia Institute of Technology via edX
Applied Optimization For Wireless, Machine Learning, Big Data
Indian Institute of Technology Kanpur via Swayam
Statistical Machine Learning
Eberhard Karls University of Tübingen via YouTube
Convex Optimization
NIOS via YouTube