
The Elusive Generalization: Classical Bounds to Double Descent to Grokking

Offered By: Simons Institute via YouTube

Tags

Machine Learning, Neural Networks, Interpolation, Overfitting, Generalization, Early Stopping

Course Description

Overview

Explore the evolution of generalization in machine learning through this comprehensive lecture by Misha Belkin of the University of California, San Diego. Delve into recent developments, chiefly empirical findings from over-parameterized neural networks, that have challenged the field's traditional theoretical foundations. Examine the limitations of using training loss as a proxy for test loss, and understand the implications of phenomena such as interpolation and double descent. Investigate the practice of early stopping and its potential shortcomings in light of emergent phenomena like grokking. Analyze the fundamental challenges these discoveries pose to both the theory and practice of machine learning, and gain insight into the current state of understanding in the field.
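Since the description singles out early stopping, a minimal sketch may help fix ideas. The Python snippet below is illustrative only: the synthetic regression task, variable names, and hyperparameters are assumptions for illustration, not material from the lecture. It implements the common patience-based rule, which halts training once a held-out validation loss stops improving; grokking-style delayed generalization is exactly the regime in which such a rule can stop training too soon.

import numpy as np

rng = np.random.default_rng(0)

# Toy over-parameterized regression: more features (50) than samples (20),
# so gradient descent can interpolate the noisy training data.
X_train = rng.normal(size=(20, 50))
w_true = rng.normal(size=50)
y_train = X_train @ w_true + 0.5 * rng.normal(size=20)
X_val = rng.normal(size=(200, 50))
y_val = X_val @ w_true + 0.5 * rng.normal(size=200)

w = np.zeros(50)
lr = 0.01                                  # step size (illustrative)
patience, best_val, since_best = 20, np.inf, 0
best_w = w.copy()

for step in range(5000):
    # One step of gradient descent on mean squared training error.
    grad = 2.0 * X_train.T @ (X_train @ w - y_train) / len(y_train)
    w -= lr * grad
    val_loss = np.mean((X_val @ w - y_val) ** 2)
    if val_loss < best_val - 1e-6:         # validation loss improved
        best_val, best_w, since_best = val_loss, w.copy(), 0
    else:
        since_best += 1
    if since_best >= patience:             # no improvement for `patience` steps
        print(f"early stop at step {step}; best validation loss {best_val:.3f}")
        break

w = best_w  # restore the best-so-far weights

Restoring the best-so-far weights, rather than the final ones, is the standard variant of the rule; the lecture's point is that a monitored validation loss can plateau long before a late generalization transition would occur.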

Syllabus

The elusive generalization: classical bounds to double descent to grokking


Taught by

Simons Institute

Related Courses

Digital Signal Processing
École Polytechnique Fédérale de Lausanne via Coursera
Computational Science and Engineering using Python
Indian Institute of Technology, Kharagpur via Swayam
Computational Thinking for Modeling and Simulation
Massachusetts Institute of Technology via edX
Introduction to numerical analysis
Higher School of Economics via Coursera
Métodos numéricos para matemáticas con Octave
Universitat Politècnica de València via edX