Toward Understanding In-Context Learning
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the fascinating world of in-context learning in large language models through this 90-minute lecture by Tengyu Ma from Stanford University. Delve into the remarkable ability of these models to tackle downstream tasks by conditioning on prompts with input-output examples, without requiring parameter updates. Examine several research papers that provide theoretical explanations for in-context learning mechanisms using simplified data distributions. Gain valuable insights into this cutting-edge topic as part of the Special Year on Large Language Models and Transformers: Part 1 Boot Camp at the Simons Institute.
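The prompting setup described above can be illustrated with a minimal sketch: a few-shot prompt places input-output demonstrations before a new query, and the model is expected to infer the mapping from context alone, with no parameter updates. The prompt format and the toy task (y = 2x) below are illustrative assumptions, not from the lecture.

```python
def build_icl_prompt(examples, query):
    """Format (input, output) demonstration pairs followed by a new query.

    The model sees these pairs in its context window and must continue
    the pattern; its weights are never updated.
    """
    lines = [f"Input: {x} -> Output: {y}" for x, y in examples]
    lines.append(f"Input: {query} -> Output:")
    return "\n".join(lines)

# Hypothetical toy task: the latent rule is y = 2x.
examples = [(1, 2), (3, 6), (5, 10)]
prompt = build_icl_prompt(examples, 4)
print(prompt)
```

A model that has learned to do in-context learning would complete this prompt with "8", having inferred the rule from the three demonstrations in context.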
Syllabus
Toward Understanding In-context Learning
Taught by
Simons Institute
Related Courses
Latent State Recovery in Reinforcement Learning - John Langford
Institute for Advanced Study via YouTube
On the Critic Function of Implicit Generative Models - Arthur Gretton
Institute for Advanced Study via YouTube
Priors for Semantic Variables - Yoshua Bengio
Institute for Advanced Study via YouTube
Instance-Hiding Schemes for Private Distributed Learning
Institute for Advanced Study via YouTube
Learning Probability Distributions - What Can, What Can't Be Done - Shai Ben-David
Institute for Advanced Study via YouTube