YoVDO

Toward a Causal Analysis of Generalization in Deep Learning - Behnam Neyshabur

Offered By: Institute for Advanced Study via YouTube

Tags

Deep Learning Courses Experimental Design Courses

Course Description

Overview

Explore a thought-provoking conference talk on the causal analysis of generalization in deep learning, presented by Behnam Neyshabur from Google at the Workshop on Theory of Deep Learning: Where next? Delve into the intricacies of the generalization problem, examining concepts such as the generalization gap, predicting generalization, and factors influencing generalization balance and correlation. Gain insights into the experimental design, including high-level design and overall rank correlation, as well as the results and hypotheses proposed. Investigate various measures and concepts, including canonical ordering, path norm, flatness-based measures, protection-based measures, optimization, negative correlation, and gradient noise. Conclude with a comprehensive summary of the key findings and their implications for the field of deep learning.
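The talk evaluates how well candidate complexity measures predict generalization via overall rank correlation. As context, the sketch below computes Kendall's tau rank correlation between a hypothetical measure and observed generalization gaps across a few models; all numbers are made up for illustration and are not from the talk.

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall rank correlation: (concordant - discordant) pairs / total pairs."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(xs) * (len(xs) - 1) // 2
    return (concordant - discordant) / n_pairs

# Illustrative data: a complexity measure and the observed generalization gap
# for five hypothetical trained models (values invented for this example).
measure = [1.2, 3.4, 2.1, 5.0, 4.3]
gap = [0.02, 0.08, 0.03, 0.12, 0.09]

print(kendall_tau(measure, gap))  # 1.0: the two rankings agree perfectly
```

A tau near 1 means the measure ranks models in the same order as their actual generalization gaps, which is the sense in which a measure "predicts generalization" here.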

Syllabus

Intro
The Problem
Generalization Gap
Predicting Generalization
What Makes Generalization
Generalization Balance
Generalization Correlation
Experimental Design
High-Level Design
Overall Rank Correlation
Results
Hypothesis
Canonical Ordering
Path Norm
Flatness-Based Measures
Protection-Based Measures
Optimization
Negative Correlation
Gradient Noise
Summary
Conclusion
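One measure covered in the syllabus is the path norm. For a bias-free ReLU network it can be computed with a standard identity: push an all-ones input through the network with every weight squared, then sum the outputs. A minimal sketch with invented weights (not taken from the talk):

```python
import numpy as np

def path_norm(weights):
    """Path norm of a bias-free ReLU MLP: the sum over all input-output
    paths of the product of squared weights, computed by forwarding an
    all-ones vector through the network with each weight squared."""
    x = np.ones(weights[0].shape[1])
    for W in weights:
        x = (W ** 2) @ x
    return float(np.sum(x))

# Toy two-layer network with made-up weight matrices.
W1 = np.array([[1.0, -2.0], [0.5, 1.0]])  # hidden x input
W2 = np.array([[2.0, -1.0]])              # output x hidden
print(path_norm([W1, W2]))  # 21.25
```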


Taught by

Institute for Advanced Study

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Data Analysis Problems)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX