Neural Networks Meet Nonparametric Regression: Generalization by Weight Decay and Large Learning Rates
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the intersection of neural networks and nonparametric regression in this lecture by Yu-Xiang Wang of UC San Diego. Delve into why overparameterized deep learning models outperform classical methods such as kernels and splines. Examine how standard hyperparameter tuning in deep neural networks (DNNs) implicitly uncovers hidden sparsity and low-dimensional structure, leading to improved adaptivity. Gain new perspectives on overparameterization, representation learning, and the generalization of neural networks through implicit biases induced by the optimization algorithm, such as Edge of Stability and minima stability. Analyze theory and examples illustrating how DNNs achieve adaptive and near-optimal generalization, shedding light on their effectiveness in practice.
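The following sketch is not from the lecture itself; it is a minimal illustration of the setting the talk studies: an overparameterized two-layer ReLU network fit to a 1-D nonparametric regression target by full-batch gradient descent with weight decay and a comparatively large step size. All sizes, rates, and the target function are illustrative choices, not the speaker's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a smooth 1-D target: the nonparametric regression setup
n = 64
x = np.linspace(-1.0, 1.0, n).reshape(-1, 1)
y = np.sin(3.0 * x) + 0.05 * rng.standard_normal((n, 1))

# Overparameterized: far more hidden units than training points
width = 512
scale = 1.0 / np.sqrt(width)          # keeps the output O(1) at init
W1 = rng.standard_normal((1, width))
b1 = rng.uniform(-1.0, 1.0, width)    # spread ReLU kinks over the input range
W2 = rng.standard_normal((width, 1))

def forward(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # ReLU features
    return h, (h @ W2) * scale

lr = 0.2    # comparatively large step size
wd = 1e-4   # weight decay strength
for _ in range(3000):
    h, pred = forward(x)
    err = pred - y                        # gradient of 0.5 * MSE w.r.t. pred (times n)
    gW2 = scale * (h.T @ err) / n
    dh = scale * (err @ W2.T) * (h > 0.0)  # backprop through the ReLU mask
    gW1 = (x.T @ dh) / n
    gb1 = dh.mean(axis=0)
    # Gradient step with an L2 weight-decay term on the weights
    W2 -= lr * (gW2 + wd * W2)
    W1 -= lr * (gW1 + wd * W1)
    b1 -= lr * gb1

_, pred = forward(x)
mse = float(np.mean((pred - y) ** 2))
print(f"training MSE with weight-decayed GD: {mse:.4f}")
```

Despite having many more parameters than data points, the weight-decayed iterates settle on a smooth fit rather than interpolating the noise, which is the kind of implicit-regularization behavior the lecture analyzes.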
Syllabus
Neural Networks Meet Nonparametric Regression: Generalization by Weight Decay and Large Learning Rates
Taught by
Simons Institute
Related Courses
Launching into Machine Learning (日本語版, Japanese version)
Google Cloud via Coursera
Launching into Machine Learning (auf Deutsch, German version)
Google Cloud via Coursera
Launching into Machine Learning (en Français, French version)
Google Cloud via Coursera
Launching into Machine Learning (en Español, Spanish version)
Google Cloud via Coursera
Основы машинного обучения (Machine Learning Fundamentals)
Higher School of Economics via Coursera