Synthetic Data - Friend or Foe in the Age of Scaling?
Offered By: Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Course Description
Overview
Explore the impact of synthetic data on AI and large language model scaling in this 56-minute lecture by Julia Kempe, hosted by the Institut des Hautes Etudes Scientifiques (IHES). Delve into a theoretical framework for model collapse via scaling laws, examining how the growing share of synthesized data in training corpora affects model improvement and performance. Discover various decay phenomena, including loss of scaling, shifted scaling across generations, skill "un-learning," and grokking when human and synthetic data are combined. Learn how the theory is validated through large-scale experiments using a transformer trained on arithmetic tasks and the LLM Llama 2 for text generation. Gain insight into the challenges ahead for AI development as synthetic data becomes more prevalent in training datasets.
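The "model collapse" the lecture examines can be illustrated with a toy simulation (a hypothetical sketch, not taken from the lecture): when each model generation is trained only on samples produced by the previous generation, rare values in the tails of the distribution are progressively lost. Here "training" is stood in for by naive resampling of the empirical distribution, the simplest generative model:

```python
import random

def resample(data, n):
    """'Train' on data by memorizing its empirical distribution,
    then generate n synthetic samples (sampling with replacement)."""
    return [random.choice(data) for _ in range(n)]

random.seed(0)
# Generation 0: "human" data drawn from a standard Gaussian.
data = [random.gauss(0.0, 1.0) for _ in range(100)]
distinct = [len(set(data))]

# Each subsequent generation sees only the previous generation's output.
for gen in range(50):
    data = resample(data, 100)
    distinct.append(len(set(data)))

# Diversity can only shrink: every generation's values are a subset of
# the previous generation's, and tail values disappear first.
print(distinct[0], distinct[-1])
```

Because each generation can only repeat values the previous one emitted, the number of distinct samples is monotonically non-increasing, a crude analogue of the loss-of-scaling and tail-forgetting effects the theory formalizes.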
Syllabus
Julia Kempe - Synthetic Data – Friend or Foe in the Age of Scaling?
Taught by
Institut des Hautes Etudes Scientifiques (IHES)
Related Courses
Linear Circuits (Georgia Institute of Technology via Coursera)
Introduction to Energy and Power Engineering (مقدمة في هندسة الطاقة والقوى) (King Abdulaziz University via Rwaq (رواق))
Magnetic Materials and Devices (Massachusetts Institute of Technology via edX)
Linear Circuits 2: AC Analysis (Georgia Institute of Technology via Coursera)
Electric Power Transmission (Transmisión de energía eléctrica) (Tecnológico de Monterrey via edX)