YoVDO

Statistical Learning Theory and Neural Networks

Offered By: Simons Institute via YouTube

Tags

Statistical Learning Theory Courses
Neural Networks Courses
Gradient Descent Courses

Course Description

Overview

Explore fundamental concepts in statistical learning theory and their application to deep neural networks in this comprehensive tutorial. Delve into uniform laws of large numbers and their relationship to function class complexity. Focus on Rademacher complexity as a key complexity measure, examining upper bounds for deep ReLU networks, and investigate the apparent contradictions between modern neural network behavior and classical intuitions.

Gain insight into neural network training from an optimization perspective, reviewing gradient descent analysis for convex and smooth objectives. Understand the Polyak-Łojasiewicz (PL) inequality and its relevance to neural network training. Examine the neural tangent kernel (NTK) regime and how it approximates neural network training. Learn two approaches to establishing PL inequalities for neural networks: a general method based on NTK approximation and a specific technique for linearly separable data.
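As background for the gradient descent analysis mentioned above: for an L-smooth objective satisfying the PL inequality 0.5·||∇f(w)||² ≥ μ·(f(w) − f*), gradient descent with step size 1/L converges linearly, f(w_t) − f* ≤ (1 − μ/L)^t (f(w_0) − f*). The sketch below is not from the tutorial; it illustrates this on a least-squares objective (a standard example that is both smooth and PL), with all variable names chosen here for illustration.

```python
import numpy as np

# Illustrative example: f(w) = 0.5 * ||A w - b||^2 is L-smooth with
# L = lambda_max(A^T A) and satisfies the PL inequality with
# mu = smallest nonzero eigenvalue of A^T A, so gradient descent with
# step size 1/L shrinks the suboptimality gap geometrically.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(w):
    r = A @ w - b
    return 0.5 * r @ r

def grad(w):
    return A.T @ (A @ w - b)

# Optimal value via least squares (f* > 0 here: the system is overdetermined).
w_star, *_ = np.linalg.lstsq(A, b, rcond=None)
f_star = f(w_star)

L = np.linalg.eigvalsh(A.T @ A).max()  # smoothness constant

w = np.zeros(5)
gaps = []                 # suboptimality gap f(w_t) - f* per iteration
for _ in range(500):
    gaps.append(f(w) - f_star)
    w = w - (1.0 / L) * grad(w)

print(gaps[0], gaps[-1])  # the gap decays geometrically toward zero
```

The same one-line convergence argument, applied through an NTK-style approximation, is what underlies the first of the two approaches to PL inequalities for neural networks covered in the tutorial.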

Syllabus

Tutorial: Statistical Learning Theory and Neural Networks I


Taught by

Simons Institute

Related Courses

Practical Predictive Analytics: Models and Methods
University of Washington via Coursera
Deep Learning Fundamentals with Keras
IBM via edX
Introduction to Machine Learning
Duke University via Coursera
Intro to Deep Learning with PyTorch
Facebook via Udacity
Introduction to Machine Learning for Coders!
fast.ai via Independent