The Elusive Generalization: Classical Bounds to Double Descent to Grokking
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the evolution of generalization in machine learning through this comprehensive lecture by Misha Belkin from the University of California, San Diego. Delve into recent developments, driven largely by empirical findings in neural networks, that have challenged the field's traditional theoretical foundations. Examine the limitations of using training loss as a proxy for test loss, and understand the implications of phenomena such as interpolation and double descent. Investigate the practice of early stopping and its potential shortcomings in light of emergent phenomena like grokking. Analyze the fundamental challenges these discoveries present to both the theory and practice of machine learning, and gain insights into the current state of understanding in the field.
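To make the double-descent phenomenon mentioned above concrete, here is a minimal sketch using random ReLU features and a minimum-norm least-squares fit; the sine target, noise level, and feature counts are illustrative assumptions, not details taken from the lecture. Test error typically spikes near the interpolation threshold (where the number of features equals the number of training points) and falls again in the overparameterized regime.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task (illustrative assumption): y = sin(2*pi*x) + noise
n_train, n_test = 40, 500
x_train = rng.uniform(-1, 1, n_train)
x_test = np.linspace(-1, 1, n_test)
y_train = np.sin(2 * np.pi * x_train) + 0.3 * rng.standard_normal(n_train)
y_test = np.sin(2 * np.pi * x_test)

def relu_features(x, W, b):
    # Random ReLU features: phi_j(x) = max(0, W_j * x + b_j)
    return np.maximum(0.0, np.outer(x, W) + b)

# Sweep model size across the interpolation threshold (n_feats == n_train == 40)
for n_feats in [5, 10, 20, 40, 80, 160, 640]:
    W = rng.standard_normal(n_feats)
    b = rng.standard_normal(n_feats)
    Phi_train = relu_features(x_train, W, b)
    Phi_test = relu_features(x_test, W, b)
    # Minimum-norm least-squares solution via the pseudoinverse; generically
    # interpolates the training data once n_feats >= n_train
    coef = np.linalg.pinv(Phi_train) @ y_train
    train_mse = np.mean((Phi_train @ coef - y_train) ** 2)
    test_mse = np.mean((Phi_test @ coef - y_test) ** 2)
    print(f"{n_feats:4d} features: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Printing the errors across the sweep shows the characteristic shape: test error rises as the model approaches the interpolation threshold and then decreases again as the feature count grows well beyond it.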
Syllabus
The elusive generalization: classical bounds to double descent to grokking
Taught by
Simons Institute
Related Courses
Neural Networks for Machine Learning (University of Toronto via Coursera)
Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)
Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)