Size of Teachers as Measure of Data Complexity - PAC-Bayes Excess Risk Bounds and Scaling Laws
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore a 57-minute lecture by Dan Roy of the University of Toronto on "The Size of Teachers as a Measure of Data Complexity: PAC-Bayes Excess Risk Bounds and Scaling Laws," recorded at IPAM's Theory and Practice of Deep Learning Workshop on October 16, 2024. Delve into the generalization properties of randomly initialized neural networks, and examine how the 2024 analysis of Buzaglo et al. extends to student networks of arbitrary width and depth, as well as to settings where no small teacher network perfectly interpolates the data. Discover an oracle inequality relating the risk of Gibbs posterior sampling to that of narrow teacher networks, and learn how it bounds sample complexity in terms of the size of the smallest low-risk teacher network. Investigate the resulting data complexity measure, defined by the minimal teacher network size needed to reach a given excess risk, and compare the implied scaling laws with empirical studies to estimate the data complexity of standard benchmarks.
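The overview refers to PAC-Bayes excess risk bounds without stating one. For orientation only, the LaTeX below gives a standard McAllester-style PAC-Bayes bound, not the specific oracle inequality presented in the lecture; the closing comment on how teacher size enters through the KL term is an informal assumption added for illustration, not a result quoted from the talk.

% Standard McAllester-style PAC-Bayes bound: for a prior P fixed before
% seeing the n i.i.d. samples, any posterior Q, and any delta in (0, 1),
% with probability at least 1 - delta over the sample,
\[
  \mathbb{E}_{h \sim Q}\!\left[ L(h) \right]
  \;\le\;
  \mathbb{E}_{h \sim Q}\!\left[ \widehat{L}_n(h) \right]
  + \sqrt{ \frac{ \mathrm{KL}(Q \,\|\, P) + \ln(n/\delta) }{ 2(n-1) } },
\]
% where L is the population risk and \widehat{L}_n the empirical risk on the
% n samples. Informally (an assumption for illustration): if P is the law of
% a randomly initialized student network and Q concentrates on students that
% emulate a narrow teacher with p parameters, then KL(Q || P) grows with p,
% so the sample size needed for a target excess risk is governed by the size
% of the smallest low-risk teacher, which is the role of the data complexity
% measure described above.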
Syllabus
Dan Roy - Size of Teachers as Measure of Data Complexity: PAC-Bayes Excess Risk Bounds & Scaling Laws
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Beyond Worst-Case Analysis - Panel Discussion (Simons Institute via YouTube)
Reinforcement Learning - Part I (Simons Institute via YouTube)
Reinforcement Learning in Feature Space: Complexity and Regret (Simons Institute via YouTube)
Exploration with Limited Memory - Streaming Algorithms for Coin Tossing, Noisy Comparisons, and Multi-Armed Bandits (Association for Computing Machinery (ACM) via YouTube)
Optimal Transport for Machine Learning - Gabriel Peyre, Ecole Normale Superieure (Alan Turing Institute via YouTube)