Supermasks in Superposition - Paper Explained
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore an in-depth analysis of the research paper "Supermasks in Superposition" in this comprehensive video lecture. Delve into the concept of supermasks: binary masks that, when applied to a fixed, randomly initialized neural network, select subnetworks that perform well on specific tasks, and see how they can be applied to lifelong learning. Learn how the system automatically infers the task ID at inference time and can distinguish up to 2500 tasks. Follow along as the lecture covers key topics including catastrophic forgetting, mask superpositions, binary maximum entropy search, and encoding masks in Hopfield networks. Gain insights into the paper's methodology, experiments, and conclusions, as well as potential applications and extensions of this innovative approach to sequential learning in neural networks.
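To make the core mechanism concrete, here is a minimal sketch of the paper's one-shot task-inference idea: all stored supermasks are superimposed with learnable coefficients, and the task whose coefficient has the steepest entropy-reducing gradient is selected. Everything below (the single linear layer, the random placeholder masks, the dimensions) is a toy assumption for illustration, not the paper's trained setup:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Fixed, randomly initialized weights; in the paper these are never trained.
W = torch.randn(784, 10)

# Placeholder binary supermasks, one per task. The paper learns these
# (e.g., with the edge-popup algorithm); random masks here just
# illustrate the mechanics.
K = 5
masks = [(torch.rand_like(W) > 0.5).float() for _ in range(K)]

def output_entropy(logits):
    # Shannon entropy of the softmax output, averaged over the batch.
    p = F.softmax(logits, dim=-1)
    return -(p * (p + 1e-12).log()).sum(dim=-1).mean()

def infer_task(x):
    # Give every mask an equal coefficient, then superimpose them all
    # into a single masked network.
    alpha = torch.full((K,), 1.0 / K, requires_grad=True)
    W_super = W * sum(a * m for a, m in zip(alpha, masks))
    H = output_entropy(x @ W_super)
    H.backward()
    # One-shot rule: the coefficient with the most negative entropy
    # gradient belongs to the mask that makes the output most confident.
    return int(torch.argmin(alpha.grad))

x = torch.randn(32, 784)  # a batch of flattened 28x28 inputs
print("inferred task:", infer_task(x))
```

Once the task ID is recovered this way, inference proceeds with only that task's mask applied to the fixed weights; the binary maximum entropy search covered in the lecture replaces this single gradient step with a binary search over subsets of tasks, using the same entropy signal.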
Syllabus
- Intro & Overview
- Catastrophic Forgetting
- Supermasks
- Lifelong Learning using Supermasks
- Inference Time Task Discrimination by Entropy
- Mask Superpositions
- Proof-of-Concept, Task Given at Inference
- Binary Maximum Entropy Search
- Task Not Given at Inference
- Task Not Given at Training
- Ablations
- Superfluous Neurons
- Task Selection by Detecting Outliers
- Encoding Masks in Hopfield Networks (see the sketch after this syllabus)
- Conclusion
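The final technical step in the syllabus, encoding masks in Hopfield networks, can also be sketched compactly. One extension discussed in the paper stores the learned supermasks implicitly as attractor patterns of a fixed-size Hopfield network rather than keeping them in an explicit list; the classical storage-and-recall mechanics look roughly like this (toy dimensions and random placeholder masks, not the paper's setup):

```python
import torch

torch.manual_seed(0)

D = 256   # flattened mask length (toy size)
K = 3     # number of stored masks

# Placeholder binary masks, stored as bipolar {-1, +1} patterns.
patterns = torch.sign(torch.randn(K, D))

# Hebbian outer-product rule of a classical Hopfield network.
T = (patterns.t() @ patterns) / D
T.fill_diagonal_(0)

def recall(probe, steps=10):
    # Iterate the Hopfield update; the state falls into the nearest
    # stored attractor (ideally one of the encoded masks).
    s = probe.clone()
    for _ in range(steps):
        s = torch.sign(T @ s)
    return s

# Corrupt a stored mask, then recover it from the network.
noisy = patterns[0].clone()
noisy[torch.randperm(D)[:25]] *= -1   # flip ~10% of the bits
recovered = recall(noisy)
print("bits recovered:", int((recovered == patterns[0]).sum().item()), "/", D)
```

Because each new mask is added with a single Hebbian update to T, the memory footprint stays fixed as tasks accumulate, which is what makes this encoding attractive for the lifelong-learning setting discussed in the video.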
Taught by
Yannic Kilcher
Related Courses
- Neural Networks for Machine Learning (University of Toronto via Coursera)
- Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
- Statistical Learning with R (Stanford University via edX)
- Machine Learning 1—Supervised Learning (Brown University via Udacity)
- Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)