Does Learning Require Memorization? A Short Tale About a Long Tail
Offered By: Association for Computing Machinery (ACM) via YouTube
Course Description
Overview
Explore the relationship between learning and memorization in machine learning through a thought-provoking 25-minute ACM conference talk. Delve into concepts such as overfitting, label memorization, and the role of theory in learning. Examine the importance of interpolation, hard atypical examples, and subpopulations in dataset analysis. Investigate the challenges posed by long-tailed data distributions and their impact on model performance. Discuss the benefits and potential drawbacks of fitting data, including its connection to memorization. Extend the analysis beyond discrete domains and explore the concept of coupling. Review experimental validation conducted with Chiyuan Zhang, and draw insightful conclusions about the nature of learning and memorization in artificial intelligence.
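The talk's central object, long-tailed data, can be illustrated with a small sketch. The following is a hypothetical Python illustration (not material from the talk itself), assuming subpopulation frequencies follow a Zipf-like power law: in a finite sample, many rare subpopulations appear exactly once or not at all, which is the tail whose singleton examples a model can only fit by memorizing them.

```python
import numpy as np

rng = np.random.default_rng(0)

n_subpops = 1000   # number of distinct subpopulations (assumed)
n_samples = 5000   # size of the training sample (assumed)

# Zipf-like power-law frequencies over subpopulations, normalized to sum to 1.
freqs = 1.0 / np.arange(1, n_subpops + 1)
freqs /= freqs.sum()

# Draw a dataset: each example belongs to one subpopulation.
sample = rng.choice(n_subpops, size=n_samples, p=freqs)
counts = np.bincount(sample, minlength=n_subpops)

# Subpopulations seen exactly once ("singletons") look like hard, atypical
# examples; subpopulations never seen at all are the unobserved tail.
singletons = int((counts == 1).sum())
unseen = int((counts == 0).sum())
print("subpopulations seen exactly once:", singletons)
print("subpopulations never seen:", unseen)
```

Even with five times as many samples as subpopulations, a substantial fraction of subpopulations shows up once or never, which is why fitting such singleton examples is hard to distinguish from label memorization.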
Syllabus
Intro
(Over?)fitting the dataset
Label memorization
What about theory?
Interpolation
Hard atypical examples
Subpopulations
Long-tailed data
The tail rears its head
Model
Benefits of fitting
Fitting and memorization
Beyond discrete domains
Coupling
Experimental validation (with Chiyuan Zhang)
Conclusions
Taught by
Association for Computing Machinery (ACM)
Related Courses
Practical Machine Learning - Johns Hopkins University via Coursera
Practical Deep Learning For Coders - fast.ai via Independent
機器學習基石下 (Machine Learning Foundations, Part 2)---Algorithmic Foundations - National Taiwan University via Coursera
Data Analytics Foundations for Accountancy II - University of Illinois at Urbana-Champaign via Coursera
Entraînez un modèle prédictif linéaire (Train a Linear Predictive Model) - CentraleSupélec via OpenClassrooms