Can Learning Theory Resist Deep Learning? - Francis Bach, INRIA
Offered By: Alan Turing Institute via YouTube
Course Description
Overview
Syllabus
Intro
Scientific context
Parametric supervised machine learning
Convex optimization problems
Exponentially convergent SGD for smooth finite sums
Exponentially convergent SGD for finite sums
Convex optimization for machine learning
Theoretical analysis of deep learning
Optimization for multi-layer neural networks
Gradient descent for a single hidden layer
Optimization on measures
Many particle limit and global convergence (Chizat and Bach, 2018a)
Simple simulations with neural networks
From qualitative to quantitative results?
Lazy training (Chizat and Bach, 2018)
From lazy training to neural tangent kernel
Are state-of-the-art neural networks in the lazy regime?
Is the neural tangent kernel useful in practice?
Can learning theory resist deep learning?
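As a rough illustration of the syllabus topic "Gradient descent for a single hidden layer," the following is a toy sketch (not taken from the lecture; the data, network width, and learning rate are all illustrative choices) of full-batch gradient descent on a one-hidden-layer ReLU network for 1-D regression:

```python
# Illustrative sketch only: gradient descent on a single-hidden-layer
# ReLU network, fit to a toy 1-D regression problem (y = sin x).
# All hyperparameters below are arbitrary choices, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)

n, m = 64, 200                          # n samples, m hidden units
X = rng.uniform(-np.pi, np.pi, size=(n, 1))
y = np.sin(X[:, 0])

# Parameters: input weights w (m, 1), biases b (m,), output weights a (m,).
w = rng.normal(size=(m, 1))
b = rng.normal(size=m)
a = rng.normal(size=m) / m              # small output weights at init

def forward(X, w, b, a):
    pre = X @ w.T + b                   # (n, m) pre-activations
    h = np.maximum(pre, 0.0)            # ReLU features
    return pre, h, h @ a                # predictions, shape (n,)

def loss(pred, y):
    return 0.5 * np.mean((pred - y) ** 2)

lr = 0.001
_, _, pred0 = forward(X, w, b, a)
loss0 = loss(pred0, y)

for _ in range(1000):
    pre, h, pred = forward(X, w, b, a)
    r = (pred - y) / n                  # residual scaled by 1/n, shape (n,)
    grad_a = h.T @ r                    # gradient w.r.t. output weights
    dh = np.outer(r, a) * (pre > 0)     # backprop through ReLU, (n, m)
    grad_w = dh.T @ X                   # gradient w.r.t. input weights
    grad_b = dh.sum(axis=0)             # gradient w.r.t. biases
    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

loss_final = loss(forward(X, w, b, a)[2], y)
print(f"loss: {loss0:.4f} -> {loss_final:.4f}")
```

The training loss decreases steadily on this toy problem; the talk's questions concern what can be proven about such dynamics in the many-particle (wide-network) limit and in the lazy/neural-tangent-kernel regime.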
Taught by
Alan Turing Institute
Related Courses
Neural Networks for Machine Learning - University of Toronto via Coursera
Good Brain, Bad Brain: Basics - University of Birmingham via FutureLearn
Statistical Learning with R - Stanford University via edX
Machine Learning 1—Supervised Learning - Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks - Harvard University via edX