
Surprising Phenomena of Max-LP-Margin Classifiers in High Dimensions

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Machine Learning Courses, Neural Networks Courses, Interpolation Courses, Classification Courses, High-dimensional Statistics Courses

Course Description

Overview

Explore a 53-minute conference talk by Fanny Yang from ETH Zurich on "Surprising phenomena of max-lp-margin classifiers in high dimensions," presented at IPAM's Theory and Practice of Deep Learning Workshop. Delve into the analysis of max-lp-margin classifiers and their relevance to the implicit bias of first-order methods and to harmless interpolation in neural networks. Discover unexpected findings in the noiseless case: while minimizing the l1-norm achieves optimal rates for regression with hard-sparse ground truths, this adaptivity does not carry over directly to max-l1-margin classification. Investigate how, with noisy observations, max-lp-margin classifiers can achieve 1/√n rates for p slightly larger than one, while max-l1-margin classifiers only achieve rates of order 1/√(log(d/n)). Gain insights into cutting-edge research in machine learning theory and its implications for deep learning practice.
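
For a concrete picture of the object studied in the talk, the sketch below sets up a max-lp-margin classifier as a convex program: maximize the smallest signed margin on the training set subject to an lp-norm-ball constraint on the parameter vector. This is a minimal illustration only, not code from the talk; the cvxpy dependency, the synthetic hard-sparse ground truth, and the specific choices n = 50, d = 500, p = 1.1 are all assumptions made for the example.

import numpy as np
import cvxpy as cp

# Synthetic noiseless setup (illustrative values, not from the talk):
# n samples, d >> n features, and a hard-sparse ground truth.
rng = np.random.default_rng(0)
n, d, p = 50, 500, 1.1
theta_star = np.zeros(d)
theta_star[:3] = 1.0                        # hard-sparse: only 3 active coordinates
X = rng.standard_normal((n, d))
y = np.sign(X @ theta_star)                 # noiseless labels

# Max-lp-margin classifier: maximize the minimum margin over the lp-norm ball.
theta = cp.Variable(d)
margin = cp.Variable()
constraints = [
    cp.multiply(y, X @ theta) >= margin,    # every training point has margin >= `margin`
    cp.pnorm(theta, p) <= 1,                # lp-norm-ball constraint (convex for p >= 1)
]
cp.Problem(cp.Maximize(margin), constraints).solve()

print("achieved max-lp-margin:", margin.value)

Taking p = 1 recovers the max-l1-margin classifier discussed in the talk, while values of p slightly above one give the variant for which the faster 1/√n rates are claimed; the code only illustrates the optimization problem, not those statistical results.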

Syllabus

Fanny Yang - Surprising phenomena of max-lp-margin classifiers in high dimensions - IPAM at UCLA


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Digital Signal Processing
École Polytechnique Fédérale de Lausanne via Coursera
Computational Science and Engineering using Python
Indian Institute of Technology, Kharagpur via Swayam
Computational Thinking for Modeling and Simulation
Massachusetts Institute of Technology via edX
Introduction to numerical analysis
Higher School of Economics via Coursera
Métodos numéricos para matemáticas con Octave
Universitat Politècnica de València via edX