Statistical Learning Theory and Neural Networks

Offered By: Simons Institute via YouTube

Tags

Statistical Learning Theory, Neural Networks, Gradient Descent

Course Description

Overview

Explore fundamental concepts in statistical learning theory and their application to deep neural networks in this tutorial. Delve into uniform laws of large numbers and how they depend on the complexity of the function class, focusing on Rademacher complexity as the key complexity measure and examining upper bounds on it for deep ReLU networks. Investigate the apparent contradictions between the behavior of modern neural networks and classical intuitions about generalization.

Then turn to neural network training from an optimization perspective: review the analysis of gradient descent for convex and smooth objectives, and understand the Polyak-Łojasiewicz (PL) inequality and its relevance to neural network training. Examine the neural tangent kernel (NTK) regime, in which training a wide network is well approximated by training its linearization. Learn two approaches to establishing PL inequalities for neural networks: a general method based on the NTK approximation and a specific technique for linearly separable data.
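
For reference, the complexity measure at the center of the first part is the empirical Rademacher complexity. The following is its standard definition together with a generic uniform-convergence bound of the kind the tutorial discusses; the notation here is a sketch of the usual textbook statement, not taken from the lecture itself:

\[
\widehat{\mathcal{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i)\right],
\qquad \sigma_1,\dots,\sigma_n \ \text{i.i.d. uniform on } \{-1,+1\},
\]

and, for functions bounded in $[0,1]$, with probability at least $1-\delta$ over the sample $S = (x_1,\dots,x_n)$,

\[
\sup_{f \in \mathcal{F}} \left| \frac{1}{n}\sum_{i=1}^{n} f(x_i) - \mathbb{E}\, f(x) \right|
\;\le\; 2\,\widehat{\mathcal{R}}_S(\mathcal{F}) \;+\; O\!\left(\sqrt{\frac{\log(1/\delta)}{n}}\right).
\]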

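The PL inequality mentioned in the overview is likewise a standard condition; a minimal statement of it and of the gradient-descent rate it yields (again in generic notation, not quoted from the lecture) is: a differentiable objective $L$ with infimum $L^* = \inf_w L(w)$ satisfies the PL inequality with constant $\mu > 0$ if

\[
\tfrac{1}{2}\,\lVert \nabla L(w) \rVert^2 \;\ge\; \mu \left( L(w) - L^* \right) \quad \text{for all } w.
\]

If $L$ is additionally $\beta$-smooth, then gradient descent $w_{t+1} = w_t - \tfrac{1}{\beta}\nabla L(w_t)$ satisfies

\[
L(w_{t+1}) - L^* \;\le\; \left(1 - \tfrac{\mu}{\beta}\right)\left(L(w_t) - L^*\right),
\]

so the loss converges linearly without any convexity assumption. This is why both approaches covered in the tutorial aim to establish a PL inequality for the neural network training objective, for instance via the NTK approximation in the wide-network regime.
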
Syllabus

Tutorial: Statistical Learning Theory and Neural Networks I


Taught by

Simons Institute

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX