Flexible Neural Networks and the Frontiers of Meta-Learning
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the cutting-edge concepts of flexible neural networks and meta-learning in this 50-minute lecture by Chelsea Finn of Stanford University. Delve into the challenge of enabling agents to learn skills in the real world, with few-shot image classification as a running example. Examine the meta-learning problem from both mechanistic and probabilistic perspectives, and see how supervised learning relates to few-shot learning. Learn about optimization-based inference techniques and how data from previous objects can be leveraged for quick adaptation to new ones. Gain insight into a practical instantiation of FTML (Follow The Meta Leader) and its experimental results in this thought-provoking talk from the Simons Institute's "Emerging Challenges in Deep Learning" series.
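The sketch below is not taken from the lecture; it is a minimal illustration of the optimization-based meta-learning idea the overview refers to (adapt to a task with a few gradient steps on a small support set, then update the shared initialization so that adaptation performs well on held-out query data). The model, task distribution, and function names (predict, inner_adapt, meta_loss) are illustrative assumptions, not the lecture's code.

```python
# Minimal sketch of optimization-based meta-learning (MAML-style), assuming a
# toy linear-regression task family. Illustrative only.
import jax
import jax.numpy as jnp

def predict(params, x):
    # Tiny linear model: params = (w, b); enough to show the two-level update.
    w, b = params
    return x * w + b

def loss(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

def inner_adapt(params, x_support, y_support, inner_lr=0.1, steps=1):
    # Inner loop: a few gradient steps on the task's small support set.
    for _ in range(steps):
        grads = jax.grad(loss)(params, x_support, y_support)
        params = tuple(p - inner_lr * g for p, g in zip(params, grads))
    return params

def meta_loss(params, task):
    # Outer objective: loss on the task's query set after adaptation.
    x_s, y_s, x_q, y_q = task
    adapted = inner_adapt(params, x_s, y_s)
    return loss(adapted, x_q, y_q)

# Outer loop: meta-gradient steps over randomly drawn toy regression tasks.
key = jax.random.PRNGKey(0)
params = (jnp.array(0.0), jnp.array(0.0))
meta_lr = 0.01
for _ in range(100):
    key, k1, k2 = jax.random.split(key, 3)
    slope = jax.random.uniform(k1, (), minval=-2.0, maxval=2.0)
    x = jax.random.normal(k2, (10,))
    y = slope * x
    task = (x[:5], y[:5], x[5:], y[5:])
    grads = jax.grad(meta_loss)(params, task)
    params = tuple(p - meta_lr * g for p, g in zip(params, grads))
```

The same two-level structure underlies few-shot image classification: the inner loop adapts to a handful of labeled examples per class, and the outer loop tunes the initialization across many such tasks.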
Syllabus
Intro
How can we enable agents to learn skills in the real world?
Example: Few-Shot Image Classification
The Meta-Learning Problem: The Mechanistic View
The Meta-Learning Problem: The Probabilistic View
Supervised Learning
Few-Shot Learning
Optimization-Based Inference
Leverage data from previous objects to quickly adapt to new ones?
Practical instantiation of FTML
Experiments
Taught by
Simons Institute
Related Courses
Hiper-Memória & Hiper-Aprendizagem (Udemy)
Stanford CS330: Deep Multi-Task and Meta Learning (Stanford University via YouTube)
Stanford Seminar - The Next Generation of Robot Learning (Stanford University via YouTube)
Parameter Prediction for Unseen Deep Architectures - With First Author Boris Knyazev (Yannic Kilcher via YouTube)
Efficient and Modular Implicit Differentiation - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)