Scaling Data-Constrained Language Models
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the challenges and solutions for scaling language models in data-constrained environments in this lecture by Sasha Rush of Cornell University and Hugging Face. Delve into empirical experiments that investigate how data repetition and compute budget affect large language models. Discover a proposed scaling law for compute optimality that accounts for the diminishing returns of repeated tokens and excess parameters. Examine approaches to mitigating data scarcity as the availability of internet text becomes a limiting factor on training dataset size for LLMs. Gain insight into the future of language model development and strategies for overcoming data constraints in artificial intelligence and natural language processing.
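To illustrate the diminishing-returns idea the lecture discusses, the sketch below models repeated tokens as contributing exponentially less "effective" training data with each additional epoch. The functional form and the saturation constant `r_star` are illustrative assumptions for this listing, not the fitted values presented in the lecture.

```python
import math

def effective_data(unique_tokens: float, repetitions: float,
                   r_star: float = 15.0) -> float:
    """Effective training data when unique tokens are repeated.

    Each extra pass over the same unique tokens contributes
    exponentially less, saturating after roughly ``r_star`` repeats.
    Both the functional form and ``r_star`` are illustrative
    assumptions, not values from the lecture.
    """
    decay = 1.0 - math.exp(-repetitions / r_star)
    return unique_tokens + unique_tokens * r_star * decay

# A single pass (zero repeats) yields exactly the unique-token count.
print(effective_data(100e9, 0))
# Four extra epochs help, but give less than 5x the unique data:
# diminishing returns from repetition.
print(effective_data(100e9, 4) < 5 * 100e9)
```

Under this toy model, a fixed compute budget is better spent on fresh data than on many repeated epochs once the exponential term saturates.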
Syllabus
Scaling Data-Constrained Language Models
Taught by
Simons Institute
Related Courses
Microsoft Bot Framework and Conversation as a Platform - Microsoft via edX
Unlocking the Power of OpenAI for Startups - Microsoft for Startups - Microsoft via YouTube
Improving Customer Experiences with Speech to Text and Text to Speech - Microsoft via YouTube
Stanford Seminar - Deep Learning in Speech Recognition - Stanford University via YouTube
Select Topics in Python: Natural Language Processing - Codio via Coursera