Approximation Theory of Deep Learning from the Dynamical Viewpoint
Offered By: Fields Institute via YouTube
Course Description
Syllabus
Intro
Deep Learning: Theory vs Practice
Composition is Dynamics
Supervised Learning
The Problem of Approximation
Example: Approximation by Trigonometric Polynomials
The Continuum Idealization of Residual Networks
How do dynamics approximate functions?
Universal Approximation by Dynamics
Approximation of Symmetric Functions by Dynamical Hypothesis Spaces
Sequence Modelling Applications
DL Architectures for Sequence Modelling
Modelling Static vs Dynamic Relationships
An Approximation Theory for Sequence Modelling
The Recurrent Neural Network Hypothesis Space
The Linear RNN Hypothesis Space
Properties of Linear RNN Hypothesis Space
Approximation Guarantee (Density)
Smoothness and Memory
Insights on the (Linear) RNN Hypothesis Space
Convolutional Architectures
Encoder-Decoder Architectures
Extending the RNN Analysis
Taught by
Fields Institute
Related Courses
Create Image Captioning Models (German) - Google Cloud via Coursera
Encoder-Decoder Architecture (German) - Google Cloud via Coursera
Advanced Chatbots with Deep Learning and Python - Packt via Coursera
Machine Translation with Keras - DataCamp
Natural Language Generation in Python - DataCamp