iMAML: Meta-Learning with Implicit Gradients
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a detailed video explanation of the iMAML (Implicit Model-Agnostic Meta-Learning) paper, which presents an approach to gradient-based meta-learning that avoids backpropagating through the entire inner optimization procedure. Learn how adding a quadratic (proximal) regularizer to the inner objective lets the meta-gradient be computed implicitly from the inner solution alone, independent of the optimization path taken to reach it. Dive into key concepts including meta-learning fundamentals, the differences between MAML and iMAML, problem formulation, proximal regularization, and the derivation of the implicit gradient. Gain insights into the intuition behind this approach, understand the full algorithm, and examine experimental results. This comprehensive breakdown covers the paper's abstract and authors, and provides links to additional resources for further study.
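To make the implicit-gradient idea concrete, here is a minimal sketch of a single iMAML-style meta-gradient step on one task, assuming a toy quadratic inner loss so the Hessian-vector product is explicit; names such as `lam`, `inner_solve`, and `implicit_meta_grad` are illustrative and not taken from the video or paper. The key point is that the outer gradient is obtained by solving (I + (1/λ) ∇²L_train(φ*)) g = ∇L_test(φ*) with conjugate gradients at the inner solution φ*, so no backpropagation through the inner optimization path is needed.

```python
# Minimal iMAML-style sketch on a toy quadratic task (illustrative only).
import numpy as np

def train_loss_grad(phi, A, b):
    # Gradient of the toy inner loss L_train(phi) = 0.5*phi^T A phi - b^T phi.
    return A @ phi - b

def inner_solve(theta, A, b, lam, steps=200, lr=0.05):
    # Approximately minimize L_train(phi) + (lam/2)*||phi - theta||^2 by gradient descent.
    phi = theta.copy()
    for _ in range(steps):
        phi -= lr * (train_loss_grad(phi, A, b) + lam * (phi - theta))
    return phi

def implicit_meta_grad(theta, A, b, test_grad_fn, lam):
    # Meta-gradient = (I + (1/lam) * H_train(phi*))^{-1} grad L_test(phi*),
    # solved with conjugate gradients using only Hessian-vector products.
    phi_star = inner_solve(theta, A, b, lam)
    v = test_grad_fn(phi_star)
    matvec = lambda g: g + (A @ g) / lam  # H_train = A for the quadratic toy loss
    # Plain conjugate-gradient loop.
    g = np.zeros_like(v)
    r = v - matvec(g)
    p = r.copy()
    for _ in range(50):
        Ap = matvec(p)
        alpha = (r @ r) / (p @ Ap)
        g += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < 1e-8:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return g

# Usage: one meta-gradient step of the outer parameters on a single task.
rng = np.random.default_rng(0)
A = np.eye(3) * 2.0
b = rng.normal(size=3)
theta = np.zeros(3)
test_grad = lambda phi: phi - np.ones(3)  # grad of 0.5*||phi - 1||^2
meta_g = implicit_meta_grad(theta, A, b, test_grad, lam=1.0)
theta -= 0.1 * meta_g
```

In a real few-shot setup the quadratic toy loss would be replaced by a neural network's training and test losses, with Hessian-vector products computed by automatic differentiation rather than an explicit matrix.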
Syllabus
- Intro
- What is Meta-Learning?
- MAML vs iMAML
- Problem Formulation
- Proximal Regularization
- Derivation of the Implicit Gradient
- Intuition why this works
- Full Algorithm
- Experiments
Taught by
Yannic Kilcher
Related Courses
- Stanford Seminar - Enabling NLP, Machine Learning, and Few-Shot Learning Using Associative Processing (Stanford University via YouTube)
- GUI-Based Few Shot Classification Model Trainer - Demo (James Briggs via YouTube)
- HyperTransformer - Model Generation for Supervised and Semi-Supervised Few-Shot Learning (Yannic Kilcher via YouTube)
- GPT-3 - Language Models Are Few-Shot Learners (Yannic Kilcher via YouTube)
- Few-Shot Learning in Production (HuggingFace via YouTube)