iMAML: Meta-Learning with Implicit Gradients
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore a detailed video explanation of the iMAML (Implicit Model-Agnostic Meta-Learning) paper, which presents an innovative approach to gradient-based meta-learning. Learn how this method avoids backpropagating through the full inner optimization procedure by introducing a quadratic (proximal) regularizer, which makes the meta-gradient computable via implicit differentiation. Dive into key concepts including meta-learning fundamentals, the differences between MAML and iMAML, problem formulation, proximal regularization, and the derivation of implicit gradients. Gain insights into the intuition behind this approach, understand the full algorithm, and examine experimental results. This comprehensive breakdown covers the paper's abstract and authors, and provides links to additional resources for further study.
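To make the core idea concrete: iMAML shows that, at the inner-loop solution of the proximally regularized objective, the meta-gradient equals the outer-loss gradient multiplied by the inverse of (I + Hessian/lambda), so no unrolled backpropagation is needed. Below is a minimal sketch on a toy quadratic task where everything has a closed form; the matrices `A`, `b` and the regularization strength `lam` are illustrative choices, not from the paper.

```python
import numpy as np

# Toy quadratic inner task: L_task(phi) = 0.5 * phi^T A phi - b^T phi
# (A, b, lam, theta are illustrative values for this sketch).
lam = 2.0
A = np.array([[3.0, 0.5], [0.5, 1.0]])  # Hessian of the inner task loss
b = np.array([1.0, -1.0])
theta = np.array([0.5, 0.5])            # meta-parameters

# Inner solve: minimize L_task(phi) + (lam/2) * ||phi - theta||^2.
# For a quadratic loss this has the closed form below.
phi_star = np.linalg.solve(A + lam * np.eye(2), b + lam * theta)

# Outer-loss gradient at phi* (here we reuse the same quadratic as a
# stand-in outer loss, just to have a concrete vector v).
v = A @ phi_star - b

# Implicit meta-gradient: solve (I + H/lam) g = v instead of
# backpropagating through the inner optimization trajectory.
g = np.linalg.solve(np.eye(2) + A / lam, v)
print(g)  # meta-gradient with respect to theta
```

In practice the paper solves the linear system approximately with conjugate gradient using Hessian-vector products, so the Hessian is never materialized; the dense `solve` here is only for clarity on a 2-D example.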
Syllabus
- Intro
- What is Meta-Learning?
- MAML vs iMAML
- Problem Formulation
- Proximal Regularization
- Derivation of the Implicit Gradient
- Intuition why this works
- Full Algorithm
- Experiments
Taught by
Yannic Kilcher
Related Courses
- Hyper-Memory & Hyper-Learning (Udemy)
- Stanford CS330: Deep Multi-Task and Meta Learning (Stanford University via YouTube)
- Stanford Seminar - The Next Generation of Robot Learning (Stanford University via YouTube)
- Parameter Prediction for Unseen Deep Architectures - With First Author Boris Knyazev (Yannic Kilcher via YouTube)
- Efficient and Modular Implicit Differentiation - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)