From Classical Statistics to Modern Machine Learning
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the evolution from classical statistics to modern machine learning in this 50-minute lecture by Mikhail Belkin of The Ohio State University. Delve into supervised ML, generalization bounds, and the classical U-shaped generalization curve. Examine the role of interpolation in deep learning, including whether it leads to overfitting and why it remains effective even on noisy data. Investigate the "double descent" risk curve, its underlying mechanism, and its implications for linear regression. Analyze the landscape of generalization, optimization under interpolation, and the power of interpolation in modern ML techniques. Finally, gain insights into fast and effective kernel machines inspired by deep learning, and review the key points in the transition from classical statistical approaches to contemporary machine learning.
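The "double descent" risk curve mentioned above can be reproduced in a few lines. The sketch below is an illustration added to this listing, not code from the lecture: it fits minimum-norm least squares on random ReLU features and sweeps the model size across the interpolation threshold. All function names and parameter values are assumptions chosen for demonstration.

```python
# Minimal sketch (illustrative, not code from the lecture): double descent
# with minimum-norm least squares on random ReLU features.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    """Noise-free ground-truth function."""
    return np.sin(2 * np.pi * x)

n_train, noise = 40, 0.3
x_train = rng.uniform(-1.0, 1.0, n_train)
y_train = target(x_train) + noise * rng.standard_normal(n_train)
x_test = rng.uniform(-1.0, 1.0, 500)
y_test = target(x_test)

def relu_features(x, w, b):
    """Random ReLU features: phi_j(x) = max(0, w_j * x + b_j)."""
    return np.maximum(0.0, np.outer(x, w) + b)

# Sweep model size across the interpolation threshold (n_feat == n_train).
for n_feat in [5, 10, 20, 40, 80, 160, 640]:
    w = rng.standard_normal(n_feat)
    b = rng.uniform(-1.0, 1.0, n_feat)
    Phi_train = relu_features(x_train, w, b)
    Phi_test = relu_features(x_test, w, b)
    # lstsq returns the minimum-norm solution once the system is
    # underdetermined, i.e. once the fit interpolates the training data.
    coef = np.linalg.lstsq(Phi_train, y_train, rcond=None)[0]
    mse = np.mean((Phi_test @ coef - y_test) ** 2)
    print(f"features={n_feat:4d}  test MSE={mse:8.3f}")
```

In runs of this kind, test error typically peaks near the interpolation threshold, where the model first fits the noisy labels exactly, and then decreases again as the feature count grows, mirroring the second descent discussed in the talk.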
Syllabus
Intro
Supervised ML
Generalization bounds
Classical U-shaped generalization curve
Does interpolation overfit?
Interpolation does not overfit even for very noisy data
Deep learning practice
Generalization theory for interpolation?
A way forward?
Interpolated k-NN schemes
Interpolation and adversarial examples
"Double descent" risk curve
What is the mechanism?
Double descent in linear regression
Occam's razor
The landscape of generalization
Where is the interpolation threshold?
Optimization under interpolation
SGD under interpolation
The power of interpolation
Learning from deep learning: fast and effective kernel machines
Important points
From classical statistics to modern ML
Taught by
Simons Institute
Related Courses
Machine Learning (University of Washington via Coursera)
Machine Learning (Stanford University via Coursera)
Machine Learning (Georgia Institute of Technology via Udacity)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)