
Stochastic Gradient Descent and Backpropagation

Offered By: Alfredo Canziani via YouTube

Tags

Backpropagation, Neural Networks, PyTorch, Loss Functions, Stochastic Gradient Descent

Course Description

Overview

Dive into a comprehensive lecture on stochastic gradient descent and backpropagation, exploring parameterized models, loss functions, and gradient-based methods. Learn how to implement neural networks in PyTorch and understand the generalized form of backpropagation. Examine concrete examples, discuss Jacobian matrix dimensions, and explore various neural net modules while computing their gradients. Gain insights into softmax, logsoftmax, and practical tricks for backpropagation to enhance your understanding of deep learning concepts and techniques.
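The loop below is a minimal sketch, not the lecture's own code, of the kind of PyTorch training the lecture covers: a small parameterized model, a loss function, and SGD updates driven by backpropagation. All names and the random toy data are illustrative assumptions.

    # A minimal sketch, assuming a toy classification setup with random data.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(20, 64),  # 20 input features -> 64 hidden units
        nn.ReLU(),
        nn.Linear(64, 5),   # 64 hidden units -> 5 class scores
    )
    criterion = nn.CrossEntropyLoss()  # applies LogSoftMax + negative log-likelihood
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    x = torch.randn(32, 20)         # a mini-batch of 32 examples
    y = torch.randint(0, 5, (32,))  # integer class labels

    for step in range(100):
        optimizer.zero_grad()          # clear gradients from the previous step
        loss = criterion(model(x), y)  # forward pass and loss
        loss.backward()                # backpropagation computes all gradients
        optimizer.step()               # SGD parameter update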

Syllabus

– Week 2 – Lecture
– Gradient Descent Optimization Algorithm
– Advantages of SGD and Backpropagation for a Traditional Neural Net
– PyTorch Implementation of a Neural Network and a Generalized Backprop Algorithm
– Basic Modules: LogSoftMax
– Practical Tricks for Backpropagation
– Computing Gradients for NN Modules and Practical Tricks for Backpropagation (see the sketch after this syllabus)
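Since the syllabus covers the LogSoftMax module and computing gradients for neural-net modules, here is a small illustration, with a toy input and our own variable names, of a numerically stable log-softmax and its analytic gradient, checked against PyTorch's autograd:

    # Illustrative only: a toy check of log-softmax and its gradient.
    import torch

    s = torch.randn(4, requires_grad=True)  # a vector of raw scores

    # Numerically stable log-softmax: subtract the max before exponentiating.
    log_p = s - s.max() - (s - s.max()).exp().sum().log()

    # For an upstream gradient g, the analytic gradient of sum(g * log_p)
    # with respect to s is g - softmax(s) * sum(g).
    g = torch.ones_like(s)
    analytic = g - torch.softmax(s, dim=0) * g.sum()

    log_p.backward(g)                        # autograd backpropagation
    print(torch.allclose(s.grad, analytic))  # True: both agree

Subtracting the maximum score before exponentiating is the standard trick for avoiding overflow in the sum of exponentials; it cancels analytically, so the gradient is unchanged.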


Taught by

Alfredo Canziani

Related Courses

Launching into Machine Learning auf Deutsch
Google Cloud via Coursera
Statistical Methods of Data Analysis (Статистические методы анализа данных)
Higher School of Economics via Coursera
Linear Classifiers in Python
DataCamp
The Complete Neural Networks Bootcamp: Theory, Applications
Udemy
Gradient Boost
YouTube