Can Learning Theory Resist Deep Learning? - Francis Bach, INRIA
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
In this Alan Turing Institute lecture, Francis Bach (INRIA) asks whether classical learning theory can account for the success of deep learning. Starting from parametric supervised learning and convex optimization, the talk covers stochastic gradient methods for finite sums, optimization of multi-layer neural networks, the many-particle limit and global convergence results of Chizat and Bach (2018a), lazy training, and the neural tangent kernel.
Syllabus
Intro
Scientific context
Parametric supervised machine learning
Convex optimization problems
Exponentially convergent SGD for smooth finite sums (see the sketch after the syllabus)
Exponentially convergent SGD for finite sums
Convex optimization for machine learning
Theoretical analysis of deep learning
Optimization for multi-layer neural networks
Gradient descent for a single hidden layer (see the sketch after the syllabus)
Optimization on measures
Many particle limit and global convergence (Chizat and Bach, 2018a)
Simple simulations with neural networks
From qualitative to quantitative results?
Lazy training (Chizat and Bach, 2018)
From lazy training to neural tangent kernel (see the equations after the syllabus)
Are state-of-the-art neural networks in the lazy regime?
Is the neural tangent kernel useful in practice?
Can learning theory resist deep learning?
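The syllabus items on exponentially convergent SGD refer to linear ("exponential") convergence rates for stochastic gradient methods on smooth, strongly convex finite sums. As a hedged illustration rather than code from the lecture, the NumPy sketch below runs SAGA, a variance-reduced stochastic gradient method with this property, on L2-regularized least squares; the problem sizes, regularization and the 1/(3L) step size are illustrative assumptions.

# Illustrative sketch (not from the lecture): SAGA, a variance-reduced stochastic
# gradient method, converges linearly ("exponentially") on smooth, strongly
# convex finite sums such as L2-regularized least squares.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 10, 0.1                          # samples, dimension, L2 regularization (assumed values)

X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

def full_gradient(w):
    # Gradient of f(w) = 1/(2n) ||Xw - y||^2 + (lam/2) ||w||^2, used only for monitoring.
    return X.T @ (X @ w - y) / n + lam * w

# Smoothness constant of the individual losses f_i(w) = (x_i^T w - y_i)^2 / 2 + (lam/2) ||w||^2
L = np.max(np.sum(X ** 2, axis=1)) + lam
step = 1.0 / (3.0 * L)                            # standard SAGA step size

w = np.zeros(d)
grad_table = X * (X @ w - y)[:, None] + lam * w   # one stored gradient per sample
grad_mean = grad_table.mean(axis=0)

for it in range(30 * n):                          # about 30 effective passes over the data
    i = rng.integers(n)
    g_new = X[i] * (X[i] @ w - y[i]) + lam * w
    w -= step * (g_new - grad_table[i] + grad_mean)
    grad_mean += (g_new - grad_table[i]) / n      # keep the running average consistent
    grad_table[i] = g_new
    if it % (5 * n) == 0:
        print(f"pass {it // n:3d}  ||full gradient|| = {np.linalg.norm(full_gradient(w)):.2e}")

The printed full-gradient norm should decrease geometrically with the number of passes, which is the linear convergence the syllabus items name; plain SGD with a constant step size would instead stall at a noise-dominated level.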
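The items on optimization of a single hidden layer, optimization on measures, and the many-particle limit (Chizat and Bach, 2018a) concern gradient descent on wide two-layer networks viewed as a distribution over hidden units. The sketch below is a toy NumPy illustration under assumed settings (1D regression of sin(3x), width m = 500, mean-field 1/m output scaling, step size scaled with the width), not the lecture's simulations.

# Toy illustration (assumed setup, not the lecture's experiments): full-batch
# gradient descent on a wide single-hidden-layer ReLU network. The 1/m output
# scaling is the mean-field normalization; large m mimics the many-particle limit.
import numpy as np

rng = np.random.default_rng(1)
n, m, steps = 50, 500, 2000
lr = 0.2 * m                                  # step size scaled with the width (mean-field time scale)

x = np.linspace(-1.0, 1.0, n)                 # 1D inputs
y = np.sin(3.0 * x)                           # target function

a = rng.standard_normal(m)                    # input weights
b = rng.standard_normal(m)                    # biases
c = rng.standard_normal(m)                    # output weights

def forward(a, b, c, x):
    pre = np.outer(x, a) + b                  # (n, m) pre-activations
    hid = np.maximum(pre, 0.0)                # ReLU
    return hid @ c / m, pre, hid              # mean-field 1/m scaling

for t in range(steps):
    pred, pre, hid = forward(a, b, c, x)
    err = pred - y
    # Gradients of the squared loss (1/2n) sum_i (pred_i - y_i)^2
    grad_c = hid.T @ err / (n * m)
    back = (pre > 0.0) * (err[:, None] * c / m)   # back-propagated signal through the ReLU, shape (n, m)
    grad_a = back.T @ x / n
    grad_b = back.sum(axis=0) / n
    a -= lr * grad_a
    b -= lr * grad_b
    c -= lr * grad_c
    if t % 500 == 0:
        print(f"step {t:4d}  loss = {0.5 * np.mean(err ** 2):.4f}")

With hundreds of hidden units the loss decreases steadily, in line with the global convergence results for the many-particle limit; with only a handful of units the same dynamics can get stuck, which is the kind of qualitative behaviour the "simple simulations with neural networks" item points to.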
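For the lazy training and neural tangent kernel items, the standard linearization of the model around its initial parameters can be written as follows (usual definitions, not necessarily the notation used in the lecture):

f(x; \theta) \;\approx\; f(x; \theta_0) + \langle \nabla_\theta f(x; \theta_0), \, \theta - \theta_0 \rangle,
\qquad
K_{\mathrm{NTK}}(x, x') \;=\; \langle \nabla_\theta f(x; \theta_0), \, \nabla_\theta f(x'; \theta_0) \rangle.

In the lazy regime the parameters stay close to \theta_0, so training the network behaves like kernel regression with K_{\mathrm{NTK}}; Chizat and Bach (2018) relate this regime to a large scaling of the network output, in contrast with the mean-field 1/m scaling of the two-layer sketch above. The closing syllabus items ask whether state-of-the-art networks actually operate in this regime and whether the neural tangent kernel is useful in practice.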
Taught by
Alan Turing Institute
Related Courses
Practical Predictive Analytics: Models and Methods (University of Washington via Coursera)
Deep Learning Fundamentals with Keras (IBM via edX)
Introduction to Machine Learning (Duke University via Coursera)
Intro to Deep Learning with PyTorch (Facebook via Udacity)
Introduction to Machine Learning for Coders! (fast.ai via Independent)