
Gradient Descent and the Backpropagation Algorithm

Offered By: Alfredo Canziani via YouTube

Tags

Gradient Descent Courses, Supervised Learning Courses, Neural Networks Courses, PyTorch Courses, Loss Functions Courses

Course Description

Overview

Dive into a comprehensive lecture on gradient descent and the backpropagation algorithm, delivered by Yann LeCun. Explore key concepts in supervised learning, parametrized models, and loss functions before turning to the mechanics of gradient descent. Gain insights into traditional neural networks and learn how backpropagation works through non-linear functions and weighted sums. Follow along with a PyTorch implementation and discover practical applications of backpropagation. Investigate the process of learning representations and understand why shallow networks are universal approximators. Conclude by examining the relationship between multilayer architectures and the compositional structure of data in this in-depth, 1-hour-51-minute exploration of fundamental machine learning concepts.
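
To make the core loop concrete, here is a minimal sketch (not the lecture's actual code) of one gradient-descent step in PyTorch, with autograd performing the backpropagation. The model shape, data, and learning rate are illustrative assumptions:

import torch

torch.manual_seed(0)
x = torch.randn(32, 10)          # a batch of 32 inputs with 10 features (made-up data)
y = torch.randn(32, 1)           # matching regression targets (made-up data)

model = torch.nn.Sequential(     # a small parametrised model
    torch.nn.Linear(10, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
)
loss_fn = torch.nn.MSELoss()     # average loss over the batch
lr = 0.01                        # illustrative learning rate

pred = model(x)
loss = loss_fn(pred, y)
loss.backward()                  # backpropagation: fills p.grad for every parameter

with torch.no_grad():            # gradient-descent update: w <- w - lr * dL/dw
    for p in model.parameters():
        p -= lr * p.grad
        p.grad.zero_()

In practice this update is usually delegated to an optimizer such as torch.optim.SGD, but the explicit loop shows exactly what one gradient-descent step does.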

Syllabus

– Supervised learning
– Parametrised models
– Block diagram
– Loss function, average loss
– Gradient descent
– Traditional neural nets
– Backprop through a non-linear function
– Backprop through a weighted sum (a hand-worked sketch of these two steps follows the syllabus)
– PyTorch implementation
– Backprop through a functional module
– Backprop in practice
– Learning representations
– Shallow networks are universal approximators!
– Multilayer architectures == compositional structure of data
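
As a taste of the two backprop items above, this hand-worked sketch (illustrative values, not from the lecture) pushes a gradient through a weighted sum and a tanh non-linearity, then checks the manual chain rule against autograd:

import torch

x = torch.tensor([1.0, -2.0, 0.5])                     # illustrative input
w = torch.tensor([0.3, 0.1, -0.4], requires_grad=True)  # illustrative weights

s = w @ x            # weighted sum: s = w . x
z = torch.tanh(s)    # non-linear function
z.backward()         # autograd computes dz/dw

# Chain rule by hand: dz/dw = dz/ds * ds/dw = (1 - tanh(s)^2) * x
manual = ((1 - torch.tanh(s) ** 2) * x).detach()
print(torch.allclose(w.grad, manual))  # True: manual backprop matches autograd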


Taught by

Alfredo Canziani

Related Courses

– Launching into Machine Learning auf Deutsch (Google Cloud via Coursera)
– Статистические методы анализа данных [Statistical Methods of Data Analysis] (Higher School of Economics via Coursera)
– Linear Classifiers in Python (DataCamp)
– The Complete Neural Networks Bootcamp: Theory, Applications (Udemy)
– Gradient Boost (YouTube)