The Mathematics of Scaling Laws and Model Collapse in AI
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore the mathematics behind scaling laws and model collapse in AI through this hour-long lecture presented by Elvis Dohmatob of Meta Paris at IPAM's Theory and Practice of Deep Learning Workshop. Delve into the phenomenon of model collapse, which arises as AI systems like ChatGPT and Llama grow in size and capability and increasingly contribute to their own training datasets. Examine how the previously predictable power-law relationship between model performance and training data size eventually flattens out, leading to diminishing returns. Gain insights into the fundamental changes in scaling laws and their implications for AI development. Learn about the key results of recent research and the mathematical tools involved, including classical random matrix theory. Understand the collaborative efforts behind this work, involving researchers from NYU, Meta, Peking University, and more.
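The flattening described above can be sketched numerically. A minimal illustration, not taken from the lecture: neural scaling laws are often modeled as a power law in dataset size, L(T) = A·T^(−α) + E, and model collapse can be caricatured as the supply of fresh (non-synthetic) data saturating, so that loss stops improving past some point. All constants here are hypothetical.

```python
# Hypothetical scaling-law constants (for illustration only):
# A = scale, alpha = scaling exponent, E = irreducible error floor.
A, alpha, E = 10.0, 0.5, 0.1

def loss_clean(T):
    """Classic power-law scaling: more data keeps helping, with diminishing returns."""
    return A * T ** (-alpha) + E

def loss_collapsed(T, T_real):
    """Toy model of collapse: only the first T_real samples are fresh;
    beyond that, extra (self-generated) data does not reduce the loss."""
    return loss_clean(min(T, T_real))

for T in (10**2, 10**4, 10**6):
    print(f"T={T:>8}  clean={loss_clean(T):.3f}  collapsed={loss_collapsed(T, T_real=10**4):.3f}")
```

Under the clean law, loss keeps falling as T grows; under the collapsed variant it plateaus once T exceeds the fresh-data budget, which is the qualitative change in the scaling law the lecture analyzes rigorously.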
Syllabus
Elvis Dohmatob - The Mathematics of Scaling Laws and Model Collapse in AI - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
Introduction to Artificial Intelligence — Stanford University via Udacity
Probabilistic Graphical Models 1: Representation — Stanford University via Coursera
Artificial Intelligence for Robotics — Stanford University via Udacity
Computer Vision: The Fundamentals — University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) — California Institute of Technology via Independent