YoVDO

Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs - Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Neuroscience, Artificial Intelligence, Machine Learning, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Backpropagation

Course Description

Overview

Explore a comprehensive video analysis of the paper "Predictive Coding Approximates Backprop along Arbitrary Computation Graphs." The research bridges neuroscience and machine learning by showing that Predictive Coding, a biologically plausible learning algorithm, can approximate Backpropagation on any computation graph. Learn how the claim is verified experimentally by building and training CNNs and LSTMs with Predictive Coding, and what this implies about the similarities between brain function and deep neural networks. The video breaks down the key concepts, explains the pseudocode, and offers a code walkthrough to illustrate the practical applications of the research.

Syllabus

- Intro & Overview
- Backpropagation & Biology
- Experimental Results
- Predictive Coding
- Pseudocode
- Predictive Coding approximates Backprop
- Hebbian Updates
- Code Walkthrough
- Conclusion & Comments


Taught by

Yannic Kilcher

Related Courses

- Basic Behavioral Neurology (University of Pennsylvania via Coursera)
- Neuroethics (University of Pennsylvania via Coursera)
- Medical Neuroscience (Duke University via Coursera)
- Drugs and the Brain (California Institute of Technology via Coursera)
- Computational Neuroscience (University of Washington via Coursera)