On the Connection Between Neural Networks and Kernels: A Modern Perspective - Simon Du
Offered By: Institute for Advanced Study via YouTube
Course Description
Overview
Explore a comprehensive lecture on the connection between neural networks and kernels from a modern perspective. Delve into fundamental questions, empirical observations on training loss and generalization, and over-parameterization in neural networks. Examine trajectory-based analysis, kernel matrices, and the main theory behind zero training error. Investigate empirical results on generalization, convolutional neural tangent kernels, and their application to CIFAR-10. Understand global and local average pooling, and explore experiments on UCI datasets and few-shot learning. Discover graph neural tangent kernels for graph classification, and gain insights into the latest research and references in this field.
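The kernel matrices the lecture refers to are Gram matrices of parameter-space gradients at initialization. As an illustrative sketch (not taken from the lecture), the snippet below computes the empirical neural tangent kernel of a randomly initialized two-layer ReLU network, using the standard 1/sqrt(m) parametrization common in this line of work; the function name and input choices are hypothetical.

```python
import numpy as np

def two_layer_ntk(X, m=20000, seed=0):
    """Empirical NTK Gram matrix of a width-m two-layer ReLU net at random init.

    f(x) = (1/sqrt(m)) * sum_r a_r * relu(w_r . x), with w_r ~ N(0, I) and
    a_r ~ Unif{-1, +1}.  K_ij = <grad_theta f(x_i), grad_theta f(x_j)>.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((m, d))      # first-layer weights
    a = rng.choice([-1.0, 1.0], size=m)  # fixed-magnitude second-layer weights
    Z = X @ W.T                          # pre-activations, shape (n, m)
    J_a = np.maximum(Z, 0.0) / np.sqrt(m)                         # df/da_r
    J_W = (a * (Z > 0))[:, :, None] * X[:, None, :] / np.sqrt(m)  # df/dw_r
    # Gram matrix of the concatenated parameter-space gradients
    return J_a @ J_a.T + np.einsum('imd,jmd->ij', J_W, J_W)

# For unit-norm inputs, each diagonal entry concentrates around 1 as m grows.
X = np.array([[1.0, 0.0], [0.0, 1.0], [0.6, 0.8]])
K = two_layer_ntk(X)
print(np.round(K, 2))
```

At large width this empirical kernel concentrates around its infinite-width limit, which is the starting point of the trajectory-based analysis covered in the syllabus.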
Syllabus
Intro
Two Fundamental Questions
Empirical Observations on Training Loss
Over-parameterization
Empirical Observations on Generalization
Example: Two-layer NN
Trajectory-based Analysis
The Trajectory of Predictions (Cont'd)
Kernel Matrix at the Beginning
Kernel Matrix During Training
Main Theory
Zero Training Error
Empirical Results on Generalization
Convolutional Neural Tangent Kernel
CNTK on CIFAR-10
Understanding Global Average Pooling
Local Average Pooling
UCI Experiment Setup
UCI Results
Few-shot Learning Setup
Few-shot Learning Results
Graph NTK for Graph Classification
Summary
References
Taught by
Institute for Advanced Study
Related Courses
Stanford Seminar - Enabling NLP, Machine Learning, and Few-Shot Learning Using Associative Processing (Stanford University via YouTube)
GUI-Based Few Shot Classification Model Trainer - Demo (James Briggs via YouTube)
HyperTransformer - Model Generation for Supervised and Semi-Supervised Few-Shot Learning (Yannic Kilcher via YouTube)
GPT-3 - Language Models Are Few-Shot Learners (Yannic Kilcher via YouTube)
iMAML - Meta-Learning with Implicit Gradients (Yannic Kilcher via YouTube)