Building Makemore - Becoming a Backprop Ninja

Offered By: Andrej Karpathy via YouTube

Tags

Natural Language Processing (NLP) Courses
Deep Learning Courses
Neural Networks Courses
Batch Normalization Courses
Backpropagation Courses

Course Description

Overview

Dive deep into the intricacies of backpropagation in neural networks with this comprehensive video tutorial. Explore manual backpropagation through a 2-layer MLP (with BatchNorm) without relying on PyTorch autograd's loss.backward(). Gain a strong intuitive understanding of how gradients flow through the compute graph, covering the cross entropy loss, linear layers, the tanh activation, batch normalization, and embedding tables. Build competence and intuition around neural network optimization, laying the foundation for confidently innovating on and debugging modern neural networks. Hands-on exercises, supported by the provided code and resources, reinforce the material. Along the way, learn the historical context behind backpropagation and pick up insights into details such as Bessel's correction in batch normalization.
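
To make the course's core pattern concrete, here is a minimal sketch (not the course's actual code) of checking a hand-derived backward pass against PyTorch autograd, using a single linear layer, a tanh activation, and a toy loss; all variable names, shapes, and the loss choice below are illustrative assumptions.

import torch

torch.manual_seed(42)

x = torch.randn(4, 3)                      # tiny batch of inputs (illustrative shape)
W = torch.randn(3, 2, requires_grad=True)  # weights of one linear layer
b = torch.randn(2, requires_grad=True)     # bias

h = torch.tanh(x @ W + b)   # forward: linear layer followed by tanh
loss = (h ** 2).mean()      # a simple scalar loss, just for illustration
loss.backward()             # autograd reference gradients land in W.grad, b.grad

# Manual backward pass through the same compute graph.
dh = 2.0 * h / h.numel()          # d(loss)/dh for a mean of squares
dpre = dh * (1.0 - h ** 2)        # tanh backward: dtanh(z)/dz = 1 - tanh(z)^2
dW = x.T @ dpre                   # linear layer backward (weights)
db = dpre.sum(0)                  # linear layer backward (bias)

print(torch.allclose(dW, W.grad), torch.allclose(db, b.grad))  # expected: True True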

Syllabus

intro: why you should care & fun history
starter code
exercise 1: backpropping the atomic compute graph
brief digression: Bessel's correction in batchnorm
exercise 2: cross entropy loss backward pass (see the sketch after this syllabus)
exercise 3: batch norm layer backward pass
exercise 4: putting it all together
outro
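
The cross entropy exercise in the syllabus centers on a well-known shortcut: the gradient of the mean cross entropy loss with respect to the logits is the softmax of the logits, minus one at each example's target index, divided by the batch size. Below is a minimal sketch of that check; the variable names and sizes are illustrative assumptions, not the course's provided code.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
n, vocab = 8, 5
logits = torch.randn(n, vocab, requires_grad=True)
targets = torch.randint(0, vocab, (n,))

loss = F.cross_entropy(logits, targets)
loss.backward()  # autograd reference gradient lands in logits.grad

with torch.no_grad():
    dlogits = F.softmax(logits, dim=1)    # start from the softmax probabilities
    dlogits[range(n), targets] -= 1.0     # subtract 1 at each example's target class
    dlogits /= n                          # average over the batch

print(torch.allclose(dlogits, logits.grad))  # expected: True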


Taught by

Andrej Karpathy

Related Courses

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
Deep Learning: Convolutional Neural Networks in Python
Udemy
Deep Learning with PyTorch
DataCamp
Improving Neural Networks: Hyperparameter Tuning and Optimization (in Arabic)
DeepLearning.AI via Coursera
Improving the Efficiency of Deep Neural Networks (in Russian)
DeepLearning.AI via Coursera