YoVDO

Stochastic Gradient Descent and Backpropagation

Offered By: Alfredo Canziani via YouTube

Tags

Backpropagation Courses, Neural Networks Courses, PyTorch Courses, Loss Functions Courses, Stochastic Gradient Descent Courses

Course Description

Overview

Dive into a comprehensive lecture on stochastic gradient descent and backpropagation, exploring parameterized models, loss functions, and gradient-based methods. Learn how to implement neural networks in PyTorch and understand the generalized form of backpropagation. Examine concrete examples, discuss Jacobian matrix dimensions, and explore various neural net modules while computing their gradients. Gain insights into softmax and log-softmax, along with practical tricks for backpropagation, to deepen your understanding of deep learning concepts and techniques.
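The core idea the lecture builds on can be sketched in a few lines: a parameterized model, a loss, gradients obtained by the chain rule (the one-layer case of backpropagation), and stochastic updates on one sample at a time. This is a minimal illustrative sketch, not the lecture's PyTorch code; the names (`w`, `b`, `lr`) and the toy linear model are assumptions chosen for brevity.

```python
import random

def sgd_fit(data, lr=0.1, epochs=500, seed=0):
    """Fit y_hat = w*x + b by single-sample SGD on squared-error loss."""
    rng = random.Random(seed)
    w, b = rng.uniform(-1, 1), rng.uniform(-1, 1)   # parameters to learn
    for _ in range(epochs):
        x, y = data[rng.randrange(len(data))]        # one random sample: the "stochastic" part
        y_hat = w * x + b                            # forward pass
        # loss L = (y_hat - y)^2; chain rule gives the gradients:
        dL_dyhat = 2.0 * (y_hat - y)
        dL_dw = dL_dyhat * x                         # dL/dw = dL/dy_hat * dy_hat/dw
        dL_db = dL_dyhat                             # dL/db = dL/dy_hat * 1
        w -= lr * dL_dw                              # gradient descent step
        b -= lr * dL_db
    return w, b

# Toy data from the noiseless line y = 3x - 1
data = [(x / 10.0, 3.0 * (x / 10.0) - 1.0) for x in range(-10, 11)]
w, b = sgd_fit(data)
```

In the lecture's PyTorch setting the manual chain-rule lines are replaced by autograd's `loss.backward()`, but the update rule is the same.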

Syllabus

– Week 2 – Lecture
– Gradient Descent Optimization Algorithm
– Advantages of SGD, Backpropagation for Traditional Neural Net
– PyTorch implementation of Neural Network and a Generalized Backprop Algorithm
– Basic Modules - LogSoftMax
– Practical Tricks for Backpropagation
– Computing Gradients for NN Modules and Practical Tricks for Backpropagation
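One of the practical tricks behind the LogSoftMax module in the syllabus can be sketched as follows: subtracting the maximum logit before exponentiating avoids overflow, and the log-softmax formulation yields the simple gradient softmax(x) minus a one-hot target. This is an illustrative sketch under those standard definitions; the function names are assumptions, not the lecture's code.

```python
import math

def log_softmax(x):
    """Numerically stable log-softmax over a list of logits."""
    m = max(x)  # shifting by the max leaves the result unchanged but prevents exp overflow
    log_sum = m + math.log(sum(math.exp(v - m) for v in x))
    return [v - log_sum for v in x]

def nll_grad(x, target):
    """Gradient of -log_softmax(x)[target] w.r.t. the logits x."""
    s = [math.exp(v) for v in log_softmax(x)]  # softmax recovered from log-softmax
    s[target] -= 1.0                           # softmax(x) - onehot(target)
    return s

# Works even where a naive exp() would overflow:
print(log_softmax([1000.0, 1001.0, 1002.0]))
```

The gradient's components sum to zero, which is a quick sanity check when implementing such modules by hand.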


Taught by

Alfredo Canziani

Related Courses

Building Classification Models with scikit-learn
Pluralsight
Practical Deep Learning for Coders - Full Course
freeCodeCamp
Neural Networks Made Easy
Udemy
Intro to Deep Learning
Kaggle
Stochastic Gradient Descent
Great Learning via YouTube