YoVDO

The Mathematics of Scaling Laws and Model Collapse in AI

Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube

Tags

Scaling Laws Courses
Artificial Intelligence Courses
Data Science Courses
Machine Learning Courses
Deep Learning Courses
Neural Networks Courses
ChatGPT Courses
LLaMA (Large Language Model Meta AI) Courses
Random Matrix Theory Courses

Course Description

Overview

Explore the mathematics behind scaling laws and model collapse in AI in this hour-long lecture presented by Elvis Dohmatob of Meta Paris at IPAM's Theory and Practice of Deep Learning Workshop. Delve into the phenomenon of model collapse, which arises as AI systems such as ChatGPT and Llama grow in size and capability and begin contributing to their own training datasets. Examine how the previously predictable relationship between model performance and training data size, linear on a log-log scale, eventually flattens out, leading to diminishing returns. Gain insights into the resulting fundamental changes in scaling laws and their implications for AI development. Learn about the key results of recent research and the mathematical tools involved, including classical random matrix theory. Understand the collaborative effort behind this work, involving researchers from NYU, Meta, Peking University, and other institutions.

Syllabus

Elvis Dohmatob - The Mathematics of Scaling Laws and Model Collapse in AI - IPAM at UCLA


Taught by

Institute for Pure & Applied Mathematics (IPAM)

Related Courses

LLaMA - Open and Efficient Foundation Language Models - Paper Explained
Yannic Kilcher via YouTube
Alpaca & LLaMA - Can it Compete with ChatGPT?
Venelin Valkov via YouTube
Experimenting with Alpaca & LLaMA
Aladdin Persson via YouTube
What's LLaMA? ChatLLaMA? - And Some ChatGPT/InstructGPT
Aladdin Persson via YouTube
Llama Index - Step by Step Introduction
echohive via YouTube