YoVDO

Size of Teachers as Measure of Data Complexity - PAC-Bayes Excess Risk Bounds and Scaling Laws

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Neural Networks Courses Scaling Laws Courses Generalization Courses Sample Complexity Courses

Course Description

Overview

Explore a 57-minute lecture by Dan Roy of the University of Toronto on "The Size of Teachers as a Measure of Data Complexity: PAC-Bayes Excess Risk Bounds and Scaling Laws," recorded at IPAM's Theory and Practice of Deep Learning Workshop on October 16, 2024. Delve into the generalization properties of randomly initialized neural networks. Examine the extension of Buzaglo et al.'s 2024 analysis to student networks of any width and depth, and to scenarios where small teacher networks do not perfectly interpolate the data. Discover an oracle inequality relating the risk of Gibbs posterior sampling to that of narrow teacher networks, and learn how sample complexity is bounded in terms of small, low-risk teacher networks. Investigate a new measure of data complexity based on the minimal teacher network size needed to achieve a given level of excess risk, and compare the resulting scaling laws with empirical studies to estimate the data complexity of standard benchmarks.
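For context, the excess-risk bounds discussed in the lecture build on the PAC-Bayes framework. A standard form of such a bound (a McAllester-style inequality) is sketched below as generic background; this is not the precise statement from the talk, and the interpretation in the comments in terms of teacher networks is an illustrative reading of the description above:

```latex
% With probability at least 1 - \delta over an i.i.d. sample S of size n,
% simultaneously for every "posterior" Q over hypotheses (e.g., a Gibbs
% posterior over student networks) and a fixed data-independent prior P
% (e.g., the random-initialization distribution):
\mathbb{E}_{h \sim Q}\big[L(h)\big]
  \;\le\;
\mathbb{E}_{h \sim Q}\big[\widehat{L}_S(h)\big]
  + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
% where L is the population risk and \widehat{L}_S the empirical risk on S.
% Intuitively, a small, low-risk teacher network admits a posterior Q
% concentrated near the teacher with small KL(Q || P), which is one way
% teacher size can enter sample-complexity and excess-risk bounds.
```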

Syllabus

Dan Roy - Size of Teachers as Measure of Data Complexity: PAC-Bayes Excess Risk Bounds & Scaling Law


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

Introduction To Mechanical Micro Machining
Indian Institute of Technology, Kharagpur via Swayam
Biomaterials - Intro to Biomedical Engineering
Udemy
OpenAI Whisper - Robust Speech Recognition via Large-Scale Weak Supervision
Aleksa Gordić - The AI Epiphany via YouTube
Turbulence as Gibbs Statistics of Vortex Sheets - Alexander Migdal
Institute for Advanced Study via YouTube
City Analytics - Professor Peter Grindrod CBE
Alan Turing Institute via YouTube