The Dynamics of Memorization and Generalization in Deep Learning
Offered By: Institute for Pure & Applied Mathematics (IPAM) via YouTube
Course Description
Overview
Explore the complex relationship between memorization and generalization in deep learning models in this conference talk. Delve into the ubiquitous nature of memorization, examining evidence from data diets, example difficulty, and pruning techniques. Investigate whether memorization is essential for generalization, and consider theoretical work suggesting that eliminating it may not be feasible. Discover strategies to mitigate unwanted memorization, including improved data curation and efficient unlearning mechanisms. Examine the potential of pruning techniques to selectively remove memorized examples, and their impact on factual recall versus in-context learning. Gain insights into the dynamics of deep learning models and their implications for AI development.
Syllabus
Gintare Karolina Dziugaite - The dynamics of memorization and generalization in deep learning
Taught by
Institute for Pure & Applied Mathematics (IPAM)
Related Courses
CMU Advanced NLP: How to Use Pre-Trained Models (Graham Neubig via YouTube)
Stanford Seminar 2022 - Transformer Circuits, Induction Heads, In-Context Learning (Stanford University via YouTube)
Pretraining Task Diversity and the Emergence of Non-Bayesian In-Context Learning for Regression (Simons Institute via YouTube)
In-Context Learning: A Case Study of Simple Function Classes (Simons Institute via YouTube)
AI Mastery: Ultimate Crash Course in Prompt Engineering for Large Language Models (Data Science Dojo via YouTube)