Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs - Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Neuroscience, Artificial Intelligence, Machine Learning, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Backpropagation

Course Description

Overview

Explore a comprehensive video analysis of the paper "Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs." Delve into research that bridges neuroscience and machine learning by demonstrating how predictive coding, a biologically plausible learning algorithm, can approximate backpropagation on any computation graph. Learn about the experimental verification of building and training CNNs and LSTMs with predictive coding, and understand the implications for the similarities between brain function and deep neural networks. Follow along as the video breaks down the key concepts, provides pseudocode explanations, and offers a code walkthrough to illustrate the practical applications of this research.
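To make the paper's central claim concrete, here is a minimal NumPy sketch (my own illustration, not the paper's or the video's code) of predictive coding on a two-layer network. It uses the paper's fixed-prediction assumption: predictions are held at their feedforward values while the value nodes relax, with the output node clamped to the target. At equilibrium, the prediction errors recover the backprop gradients.

```python
import numpy as np

# Hypothetical two-layer tanh network; all names and sizes are illustrative.
rng = np.random.default_rng(0)
x0 = rng.normal(size=3)            # input
W1 = rng.normal(size=(4, 3)) * 0.5
W2 = rng.normal(size=(2, 4)) * 0.5
t = rng.normal(size=2)             # target

f, df = np.tanh, lambda a: 1.0 - np.tanh(a) ** 2

# Feedforward pass
a1 = W1 @ x0; x1 = f(a1)
a2 = W2 @ x1; x2 = f(a2)

# Backprop deltas for the loss L = 0.5 * ||x2 - t||^2
d2 = (x2 - t) * df(a2)             # dL/da2
d1 = (W2.T @ d2) * df(a1)          # dL/da1

# Predictive coding inference under the fixed-prediction assumption:
# predictions stay at their feedforward values (x1, x2) while the
# hidden value node v1 relaxes; the output node v2 is clamped to t.
v1 = x1.copy()
v2 = t.copy()
for _ in range(2000):
    e1 = v1 - x1                   # prediction error at the hidden layer
    e2 = v2 - x2                   # prediction error at the output layer
    # gradient descent on the energy 0.5*(||e1||^2 + ||e2||^2) w.r.t. v1
    v1 -= 0.1 * (e1 - W2.T @ (e2 * df(a2)))

# At equilibrium the prediction error equals the negative backprop gradient
pc_d1 = -(v1 - x1) * df(a1)
print(np.allclose(pc_d1, d1, atol=1e-6))  # -> True
```

The key point, and the reason "arbitrary computation graphs" matters, is that the relaxation only uses local errors and local connectivity: nothing in the update for v1 needs a global backward pass, yet its fixed point reproduces the backprop delta.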

Syllabus

- Intro & Overview
- Backpropagation & Biology
- Experimental Results
- Predictive Coding
- Pseudocode
- Predictive Coding approximates Backprop
- Hebbian Updates
- Code Walkthrough
- Conclusion & Comments


Taught by

Yannic Kilcher

Related Courses

Convolutional Neural Networks
DeepLearning.AI via Coursera
Convolutional Neural Networks in TensorFlow
DeepLearning.AI via Coursera
TensorFlow for CNNs: Transfer Learning
Coursera Project Network via Coursera
Visualizing Filters of a CNN using TensorFlow
Coursera Project Network via Coursera
Fine-tuning Convolutional Networks to Classify Dog Breeds
Coursera Project Network via Coursera