
Statistical Learning Theory and Neural Networks II

Offered By: Simons Institute via YouTube

Tags

Statistical Learning Theory, Neural Networks

Course Description

Overview

Delve into the second part of a comprehensive tutorial on statistical learning theory and neural networks. Explore approaches for understanding neural network training from an optimization perspective, beginning with a review of the classical analysis of gradient descent on convex and smooth objectives. Examine the Polyak-Łojasiewicz (PL) inequality and its interpretation in the context of neural network training. Investigate the neural tangent kernel (NTK) regime, a particular setting where neural network training is well approximated by kernel methods. Learn how to establish a PL inequality for neural networks using two approaches: a general method based on the NTK approximation and a specific technique for linearly separable data. This advanced tutorial, presented by Spencer Frei of UC Berkeley as part of the Deep Learning Theory Workshop and Summer School at the Simons Institute, builds on the foundations laid in the first session and offers deeper insights into the optimization aspects of neural network training.
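
For orientation, here is a brief sketch of the two central objects the tutorial builds on, stated in their standard textbook forms (the notation $L$, $\mu$, $\beta$, $\theta_0$ below is illustrative and not quoted from the lecture). A differentiable objective $L(\theta)$ with infimum $L^*$ satisfies the PL inequality with constant $\mu > 0$ if
$$\frac{1}{2}\,\|\nabla L(\theta)\|^2 \;\ge\; \mu\,\bigl(L(\theta) - L^*\bigr) \qquad \text{for all } \theta,$$
and if $L$ is in addition $\beta$-smooth, gradient descent with step size $1/\beta$ converges linearly:
$$L(\theta_{t+1}) - L^* \;\le\; \Bigl(1 - \frac{\mu}{\beta}\Bigr)\bigl(L(\theta_t) - L^*\bigr).$$
In the NTK regime, a sufficiently wide network $f(x;\theta)$ remains close to its linearization around the initialization $\theta_0$,
$$f(x;\theta) \;\approx\; f(x;\theta_0) + \bigl\langle \nabla_\theta f(x;\theta_0),\; \theta - \theta_0 \bigr\rangle,$$
so its training dynamics are well approximated by kernel methods with the neural tangent kernel $K(x, x') = \langle \nabla_\theta f(x;\theta_0), \nabla_\theta f(x';\theta_0) \rangle$.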

Syllabus

Tutorial: Statistical Learning Theory and Neural Networks II


Taught by

Simons Institute

Related Courses

Statistical Machine Learning
Eberhard Karls University of Tübingen via YouTube
The Information Bottleneck Theory of Deep Neural Networks
Simons Institute via YouTube
Interpolation and Learning With Scale Dependent Kernels
MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 16
MITCBMM via YouTube
Statistical Learning Theory and Applications - Class 6
MITCBMM via YouTube