
Gradients Are Not All You Need - Machine Learning Research Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Machine Learning, Optimization Algorithms, Backpropagation, Differentiable Programming

Course Description

Overview

Explore the limitations of differentiable programming techniques in machine learning through this in-depth video analysis. Delve into the chaos-based failure mode that affects various differentiable systems, from recurrent neural networks to numerical physics simulations. Examine the connection between this failure and the spectrum of the Jacobian, and learn criteria for predicting when differentiation-based optimization algorithms might falter. Investigate examples in policy learning, meta-learning optimizers, and disk packing to understand the practical implications. Discover potential solutions and consider the advantages of black-box methods in overcoming these challenges.
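To make the chaos-based failure mode concrete, here is a minimal sketch (not taken from the video or the paper) of backpropagating through an iterated system. It uses the logistic map as an illustrative chaotic system and JAX for differentiation; the function names, map choice, and parameter values are assumptions for illustration. When the per-step Jacobian has magnitude greater than one on average (the chaotic regime), the gradient of an unrolled loss with respect to the map parameter grows roughly exponentially with the number of steps, which is the spectrum-of-the-Jacobian criterion discussed in the video.

```python
# Minimal sketch: gradients through an unrolled iterated system.
# Logistic map: x_{t+1} = r * x_t * (1 - x_t).
# r = 2.5 sits in a stable fixed-point regime; r = 3.9 is chaotic,
# so products of per-step Jacobians (and hence gradients) blow up.
import jax
import jax.numpy as jnp

def unrolled_loss(r, x0, steps):
    """Mean of the iterates, differentiated through every unrolled step."""
    def body(x, _):
        x_next = r * x * (1.0 - x)
        return x_next, x_next
    _, xs = jax.lax.scan(body, x0, None, length=steps)
    return jnp.mean(xs)

grad_fn = jax.grad(unrolled_loss)  # gradient with respect to r

x0 = 0.5
for steps in (10, 50, 100, 200):
    g_stable = grad_fn(2.5, x0, steps)   # well-behaved gradient
    g_chaotic = grad_fn(3.9, x0, steps)  # magnitude explodes with steps
    print(steps, float(g_stable), float(g_chaotic))
```

Running the sketch shows the stable-regime gradient staying moderate while the chaotic-regime gradient grows by orders of magnitude as the horizon lengthens, which is why black-box (gradient-free) estimators can be preferable in such settings.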

Syllabus

- Foreword
- Intro & Overview
- Backpropagation through iterated systems
- Connection to the spectrum of the Jacobian
- The Reparameterization Trick
- Problems of reparameterization
- Example 1: Policy Learning in Simulation
- Example 2: Meta-Learning Optimizers
- Example 3: Disk packing
- Analysis of Jacobians
- What can be done?
- Just use Black-Box methods


Taught by

Yannic Kilcher

Related Courses

Swift for TensorFlow - Google I/O 2019
TensorFlow via YouTube
A Breakthrough for Natural Language - Ben Vigoda - ODSC East 2018
Open Data Science via YouTube
How Hard Is It to Train Variational Quantum Circuits?
Simons Institute via YouTube
Learning From Ranks, Learning to Rank - Jean-Philippe Vert, Google Brain
Alan Turing Institute via YouTube
Tropical Tensor Networks
Institute for Pure & Applied Mathematics (IPAM) via YouTube