Reconsidering Overfitting in the Age of Overparameterized Models
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the intriguing phenomenon of overparameterized models in machine learning through this 1-hour 18-minute lecture by Fanny Yang from ETH Zurich. Delve into the paradox of large neural networks that achieve near-zero training error on noisy datasets while still generalizing well to unseen data, challenging traditional notions of overfitting. Examine recent statistical literature that offers theoretical insight into this phenomenon, with a focus on linear models. Gain a new perspective on overfitting and generalization that aligns with empirical observations in modern machine learning practice. Part of the Modern Paradigms in Generalization Boot Camp at the Simons Institute, this talk provides a crucial foundation for navigating the complexities of training and deploying large-scale machine learning models.
Syllabus
Reconsidering Overfitting in the Age of Overparameterized Models
Taught by
Simons Institute
Related Courses
Statistical Machine Learning (Eberhard Karls University of Tübingen via YouTube)
The Information Bottleneck Theory of Deep Neural Networks (Simons Institute via YouTube)
Interpolation and Learning With Scale Dependent Kernels (MITCBMM via YouTube)
Statistical Learning Theory and Applications - Class 16 (MITCBMM via YouTube)
Statistical Learning Theory and Applications - Class 6 (MITCBMM via YouTube)