Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs - Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Neuroscience, Artificial Intelligence, Machine Learning, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Backpropagation

Course Description

Overview

Explore a comprehensive video analysis of the paper "Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs." Delve into research that bridges neuroscience and machine learning by demonstrating how predictive coding, a biologically plausible algorithm that relies only on local computations, can approximate backpropagation along arbitrary computation graphs: at the equilibrium of its inference dynamics, each node's prediction error converges to the gradient that backpropagation would compute, so weights can be trained with local, Hebbian-style updates. Learn about the experimental verification of this claim, in which CNNs and LSTMs are built and trained with predictive coding, and understand the implications for the similarities between brain function and deep neural networks. Follow along as the video breaks down the core concepts, walks through the pseudocode, and offers a code walkthrough to illustrate the practical side of this research.
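To make the central result concrete, here is a minimal NumPy sketch (my own illustration, not the paper's or the video's code) of that idea on a tiny two-layer network: a relaxation loop drives the hidden node's prediction error to the backprop error signal, after which each weight update uses only a local error and the presynaptic activity. Network shapes, the step size, and the iteration count are assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dtanh(x):
    return 1.0 - np.tanh(x) ** 2

# Toy network: x -> W1 -> tanh -> W2 -> output, squared-error loss.
W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(2, 4))
x = rng.normal(size=3)
t = rng.normal(size=2)   # supervised target

# Forward pass fixes each node's prediction at its feedforward value.
h = W1 @ x               # hidden pre-activation
a = np.tanh(h)           # hidden activation
o = W2 @ a               # network output
eps_out = t - o          # output error, set by the loss (negative gradient)

# Inference loop: the hidden value node relaxes until its own prediction
# error balances the error it passes down to the output layer.
v_h = h.copy()
eta = 0.2                # inference step size (assumed)
for _ in range(100):
    eps_h = v_h - h
    v_h += eta * (-eps_h + dtanh(h) * (W2.T @ eps_out))

# At equilibrium the local error equals the signal backprop would propagate
# to this node (the negative loss gradient at the hidden pre-activation).
eps_h = v_h - h
delta_backprop = dtanh(h) * (W2.T @ eps_out)
print(np.allclose(eps_h, delta_backprop, atol=1e-6))  # True

# Weight updates are then Hebbian: local error x presynaptic activity.
lr = 0.01
W2 += lr * np.outer(eps_out, a)
W1 += lr * np.outer(eps_h, x)
```

Because `eps_out` and `eps_h` are negative loss gradients, these outer-product updates perform ordinary gradient descent, yet no step requires a global backward pass.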

Syllabus

- Intro & Overview
- Backpropagation & Biology
- Experimental Results
- Predictive Coding
- Pseudocode
- Predictive Coding Approximates Backprop
- Hebbian Updates
- Code Walkthrough
- Conclusion & Comments


Taught by

Yannic Kilcher

Related Courses

Reinforcement Learning for Trading Strategies
New York Institute of Finance via Coursera
Natural Language Processing with Sequence Models
DeepLearning.AI via Coursera
Fake News Detection with Machine Learning
Coursera Project Network via Coursera
English/French Translator: Long Short Term Memory Networks
Coursera Project Network via Coursera
Text Classification Using Word2Vec and LSTM on Keras
Coursera Project Network via Coursera