Gradients Are Not All You Need - Machine Learning Research Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Machine Learning Courses, Optimization Algorithms Courses, Backpropagation Courses, Differentiable Programming Courses

Course Description

Overview

Explore the limitations of differentiable programming techniques in machine learning through this in-depth video analysis. Delve into the chaos-based failure mode that affects various differentiable systems, from recurrent neural networks to numerical physics simulations. Examine the connection between this failure and the spectrum of the Jacobian, and learn criteria for predicting when differentiation-based optimization algorithms might falter. Investigate examples in policy learning, meta-learning optimizers, and disk packing to understand the practical implications. Discover potential solutions and consider the advantages of black-box methods in overcoming these challenges.
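
The chaos-based failure mode the video covers has a compact illustration. Below is a minimal sketch in JAX, not taken from the video or the paper: the logistic map is assumed here purely as a stand-in for an iterated differentiable system. Differentiating through the unrolled iterations shows the gradient magnitude exploding as the unroll length grows, which is the Jacobian-spectrum effect discussed in the video.

```python
import jax

# Toy stand-in for an iterated differentiable system: the logistic map.
# In its chaotic regime (r ~ 3.9) the per-step Jacobian typically has
# magnitude > 1, so gradients backpropagated through the unrolled
# iterations grow exponentially with the number of steps (and may
# overflow to inf/nan for long unrolls).
def final_state(r, x0, steps):
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Gradient of the final state with respect to the map parameter r.
grad_r = jax.grad(final_state)

for steps in (10, 50, 100):
    print(steps, grad_r(3.9, 0.3, steps))
```

Backpropagation through an unrolled system multiplies the per-step Jacobians together, so when the spectral radius exceeds 1 the accumulated gradient grows exponentially in the number of steps; this is the criterion the video connects to the spectrum of the Jacobian.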

Syllabus

- Foreword
- Intro & Overview
- Backpropagation through iterated systems
- Connection to the spectrum of the Jacobian
- The Reparameterization Trick (see the sketch after this syllabus)
- Problems of reparameterization
- Example 1: Policy Learning in Simulation
- Example 2: Meta-Learning Optimizers
- Example 3: Disk packing
- Analysis of Jacobians
- What can be done?
- Just use Black-Box methods
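
As a companion to the syllabus item on the reparameterization trick, here is a minimal sketch in JAX, not taken from the video: a Gaussian sample z ~ N(mu, sigma²) is rewritten as z = mu + sigma · eps with eps ~ N(0, 1), so gradients can flow through the sample to mu and sigma. The toy objective E[z²] is an assumption chosen only for illustration.

```python
import jax
import jax.numpy as jnp

def loss(params, key):
    mu, log_sigma = params
    # Reparameterization: draw eps ~ N(0, 1), then form the sample
    # z = mu + sigma * eps so that z is differentiable in mu and sigma.
    eps = jax.random.normal(key)
    z = mu + jnp.exp(log_sigma) * eps
    return z ** 2  # toy objective: a single-sample estimate of E[z^2]

key = jax.random.PRNGKey(0)
params = (jnp.array(0.5), jnp.array(-1.0))  # (mu, log_sigma)
grads = jax.grad(loss)(params, key)
print(grads)
```

Because the randomness is isolated in eps, the estimator is differentiable; however, as the video discusses, such pathwise gradient estimates can still have very high variance when the underlying system is chaotic.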


Taught by

Yannic Kilcher

Related Courses

Deep Learning Fundamentals with Keras
IBM via edX
Deep Learning Essentials
Université de Montréal via edX
Deep Learning with TensorFlow 2.0
Udemy
Data Science: Deep Learning and Neural Networks in Python
Udemy
Neural Networks and Deep Learning
DeepLearning.AI via Coursera