Scaling Data-Constrained Language Models
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the challenges and solutions for scaling language models in data-constrained environments through this insightful lecture by Sasha Rush from Cornell University and Hugging Face. Delve into empirical experiments that investigate the impact of data repetition and compute budget on large language models. Discover a proposed scaling law for compute optimality that addresses the diminishing returns of repeated tokens and excess parameters. Examine approaches to mitigate data scarcity as the availability of internet text data becomes a limiting factor in training dataset size for LLMs. Gain valuable insights into the future of language model development and the strategies for overcoming data constraints in the field of artificial intelligence and natural language processing.
Syllabus
Scaling Data-Constrained Language Models
Taught by
Simons Institute
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent