Approximation Theory of Deep Learning from the Dynamical Viewpoint
Offered By: Fields Institute via YouTube
Course Description
Overview
Syllabus
Intro
Deep Learning: Theory vs Practice
Composition is Dynamics
Supervised Learning
The Problem of Approximation
Example: Approximation by Trigonometric Polynomials
The Continuum Idealization of Residual Networks
How do dynamics approximate functions?
Universal Approximation by Dynamics
Approximation of Symmetric Functions by Dynamical Hypothesis Spaces
Sequence Modelling Applications
DL Architectures for Sequence Modelling
Modelling Static vs Dynamic Relationships
An Approximation Theory for Sequence Modelling
The Recurrent Neural Network Hypothesis Space
The Linear RNN Hypothesis Space
Properties of Linear RNN Hypothesis Space
Approximation Guarantee (Density)
Smoothness and Memory
Insights on the (Linear) RNN Hypothesis Space
Convolutional Architectures
Encoder-Decoder Architectures
Extending the RNN Analysis
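The syllabus items "Composition is Dynamics" and "The Continuum Idealization of Residual Networks" refer to viewing a deep residual network as a discretized dynamical system. A minimal sketch of that idea (an illustration under our own assumptions, not code from the lecture): the residual update x ← x + dt·f(x) is the forward-Euler discretization of the ODE dx/dt = f(x), so composing many residual blocks approximates the ODE's flow map.

```python
import numpy as np

def residual_flow(x0, f, depth, total_time=1.0):
    """Compose `depth` residual blocks x <- x + dt * f(x).

    This is forward-Euler integration of dx/dt = f(x); as `depth`
    grows, the composition approaches the time-`total_time` flow map.
    """
    dt = total_time / depth
    x = np.asarray(x0, dtype=float)
    for _ in range(depth):
        x = x + dt * f(x)
    return x

# Example: f(x) = -x, whose exact flow at time 1 maps x0 to x0 * e^{-1}.
x0 = np.array([1.0])
approx = residual_flow(x0, lambda x: -x, depth=1000)
exact = x0 * np.exp(-1.0)
# With depth = 1000, the composed blocks match the exact flow closely.
```

Here `residual_flow`, `depth`, and `total_time` are names chosen for this sketch; the lectures develop the continuum limit rigorously rather than via this toy discretization.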
Taught by
Fields Institute
Related Courses
Natural Language Generation in Python (DataCamp)
Machine Translation with Keras (DataCamp)
Pytorch Transformers from Scratch - Attention Is All You Need (Aladdin Persson via YouTube)
Pytorch Seq2Seq Tutorial for Machine Translation (Aladdin Persson via YouTube)
Region Mutual Information Loss for Semantic Segmentation (University of Central Florida via YouTube)