Gradients Are Not All You Need - Machine Learning Research Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

- Machine Learning Courses
- Optimization Algorithms Courses
- Backpropagation Courses
- Differentiable Programming Courses

Course Description

Overview

Explore the limitations of differentiable programming techniques in machine learning through this in-depth video analysis. Delve into a chaos-based failure mode that affects a wide range of differentiable systems, from recurrent neural networks to numerical physics simulations. Examine how this failure mode relates to the spectrum of the Jacobian, and learn criteria for predicting when differentiation-based optimization algorithms are likely to falter. Investigate concrete examples in policy learning, meta-learning optimizers, and disk packing to understand the practical implications. Finally, review potential remedies and the cases in which black-box methods hold the advantage.
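
To make the failure mode concrete, here is a minimal NumPy sketch (an illustration for this description, not code from the video or paper). Backpropagating through an unrolled iterated system multiplies one Jacobian into the gradient per step, so the gradient norm grows or decays exponentially in the number of steps at a rate set by the Jacobian's spectral radius:

    import numpy as np

    # Sketch: backprop through s_{t+1} = A s_t unrolled for T steps.
    # The gradient of L = 0.5 * ||s_T||^2 w.r.t. s_0 accumulates one
    # factor of A.T (the per-step Jacobian, transposed) per step, so
    # its norm scales exponentially in T at a rate governed by the
    # spectral radius rho(A).
    rng = np.random.default_rng(0)

    def grad_norm_through_unroll(scale, T=100, dim=8):
        # Random matrix whose spectral radius is approximately `scale`.
        A = scale * rng.standard_normal((dim, dim)) / np.sqrt(dim)
        s = rng.standard_normal(dim)
        for _ in range(T):      # forward unroll
            s = A @ s
        g = s                   # dL/ds_T = s_T
        for _ in range(T):      # reverse mode: one Jacobian-transpose per step
            g = A.T @ g
        return float(np.linalg.norm(g))

    print("contracting (rho < 1):", grad_norm_through_unroll(0.9))
    print("expanding   (rho > 1):", grad_norm_through_unroll(1.1))

Over a hundred steps, nudging the spectral radius from just below 1 to just above 1 swings the gradient norm across many orders of magnitude; this vanishing/exploding behavior is what the video connects to chaotic dynamics.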

Syllabus

- Foreword
- Intro & Overview
- Backpropagation through iterated systems
- Connection to the spectrum of the Jacobian
- The Reparameterization Trick (see the sketch after this syllabus)
- Problems of reparameterization
- Example 1: Policy Learning in Simulation
- Example 2: Meta-Learning Optimizers
- Example 3: Disk packing
- Analysis of Jacobians
- What can be done?
- Just use Black-Box methods
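
The reparameterization item above refers to rewriting a stochastic sample so that gradients can flow through the sampling step. As a minimal sketch (an illustration using a toy objective f(z) = z**2, not an example taken from the video), for a Gaussian one writes z = mu + sigma * eps with eps ~ N(0, 1):

    import numpy as np

    # Reparameterization trick for z ~ N(mu, sigma^2): draw z as
    # mu + sigma * eps so the pathwise gradient d/dmu E[f(z)] = E[f'(z)]
    # can be estimated by differentiating through the samples.
    rng = np.random.default_rng(0)
    mu, sigma, n = 0.5, 1.0, 100_000

    def f_prime(z):
        # Derivative of the toy objective f(z) = z**2.
        return 2.0 * z

    eps = rng.standard_normal(n)
    z = mu + sigma * eps                   # reparameterized samples
    grad_estimate = f_prime(z).mean()      # pathwise gradient estimate

    # Analytic check: E[z**2] = mu**2 + sigma**2, so d/dmu = 2 * mu.
    print(grad_estimate, "vs analytic", 2.0 * mu)

The estimate matches the analytic gradient 2 * mu; the "Problems of reparameterization" chapter covers when this pathwise estimator breaks down.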


Taught by

Yannic Kilcher

Related Courses

- Deep Learning for Natural Language Processing (University of Oxford via Independent)
- Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization (DeepLearning.AI via Coursera)
- Deep Learning Part 1 (IITM) (Indian Institute of Technology Madras via Swayam)
- Deep Learning - Part 1 (Indian Institute of Technology, Ropar via Swayam)
- Logistic Regression with Python and Numpy (Coursera Project Network via Coursera)