Size of Teachers as Measure of Data Complexity - PAC-Bayes Excess Risk Bounds and Scaling Laws
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore a 57-minute lecture by Dan Roy of the University of Toronto on "The Size of Teachers as a Measure of Data Complexity: PAC-Bayes Excess Risk Bounds and Scaling Laws," recorded at IPAM's Theory and Practice of Deep Learning Workshop on October 16, 2024. Delve into the generalization properties of randomly initialized neural networks. Examine how the analysis of Buzaglo et al. (2024) extends to student networks of any width and depth, and to scenarios where small teacher networks do not perfectly interpolate the data. Discover an oracle inequality relating the risk of Gibbs posterior sampling to that of narrow teacher networks, showing that sample complexity is bounded in terms of small, low-risk teachers. Investigate a new measure of data complexity based on the minimal teacher network size needed to reach a given level of excess risk. Compare the resulting scaling laws with empirical studies to estimate the data complexity of standard benchmarks.
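The overview does not reproduce the talk's bounds. As a loose illustration only, and not the lecture's specific theorem, a textbook PAC-Bayes oracle inequality for Gibbs posterior sampling with a [0,1]-bounded loss has the form

\[
\mathbb{E}_{h \sim \hat\rho_\lambda}\!\left[R(h)\right]
\;\le\;
\inf_{\rho}\left\{ \mathbb{E}_{h \sim \rho}\!\left[\hat R_n(h)\right] + \frac{\mathrm{KL}(\rho \,\|\, \pi)}{\lambda} \right\}
+ \frac{\ln(1/\delta)}{\lambda} + \frac{\lambda}{8n},
\]

holding with probability at least \(1-\delta\) over an i.i.d. sample of size \(n\), where \(\pi\) is a prior over networks (for example, the random initialization distribution), \(\hat\rho_\lambda(dh) \propto e^{-\lambda \hat R_n(h)}\,\pi(dh)\) is the Gibbs posterior, and \(R\), \(\hat R_n\) are the population and empirical risks. In this generic sketch, taking \(\rho\) concentrated near a small, low-risk teacher network with \(P\) parameters makes the KL term scale roughly with \(P\), which is the sense in which sample complexity can be bounded by teacher size, and in which the minimal teacher size achieving a target excess risk can serve as a measure of data complexity.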
Syllabus
Dan Roy - Size of Teachers as Measure of Data Complexity: PAC-Bayes Excess Risk Bounds & Scaling Law
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Launching into Machine Learning 日本語版 (Google Cloud via Coursera)
Launching into Machine Learning auf Deutsch (Google Cloud via Coursera)
Launching into Machine Learning en Français (Google Cloud via Coursera)
Launching into Machine Learning en Español (Google Cloud via Coursera)
Основы машинного обучения (Higher School of Economics via Coursera)