Optimisation

Offered By: Alfredo Canziani via YouTube

Tags

Optimization Algorithms, Deep Learning, Gradient Descent, Momentum, Stochastic Gradient Descent

Course Description

Overview

Explore a comprehensive lecture on optimization techniques in deep learning, covering gradient descent, stochastic gradient descent (SGD), and momentum updates. Delve into adaptive methods such as RMSprop and Adam, and understand how normalization layers affect neural network training. Learn the intuition behind these methods, how their performance compares, and how they influence convergence. Discover a real-world application of neural networks in accelerating MRI scans, demonstrating the practical impact of optimization in industry.
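As a concrete point of reference, the short PyTorch sketch below (not taken from the lecture itself) fits the same small linear model with each of the optimizer families mentioned above: plain SGD, SGD with momentum, RMSprop, and Adam. The toy least-squares task, the learning rates, and the step count are assumptions chosen purely for illustration.

import torch


def train(name, make_optimizer, steps=200):
    # Same seed for every run so all optimizers start from identical
    # weights and see identical data; only the update rule differs.
    torch.manual_seed(0)
    model = torch.nn.Linear(10, 1)
    x, y = torch.randn(64, 10), torch.randn(64, 1)
    opt = make_optimizer(model.parameters())
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()  # apply this optimizer's parameter update
    final = torch.nn.functional.mse_loss(model(x), y).item()
    print(f"{name:>14s}: loss after {steps} steps = {final:.4f}")


# Optimizer families named in the description; hyperparameters below are
# illustrative placeholders, not values from the lecture.
train("SGD",            lambda p: torch.optim.SGD(p, lr=0.05))
train("SGD + momentum", lambda p: torch.optim.SGD(p, lr=0.05, momentum=0.9))
train("RMSprop",        lambda p: torch.optim.RMSprop(p, lr=0.01))
train("Adam",           lambda p: torch.optim.Adam(p, lr=0.01))

Each run rebuilds the model from the same seed, so any difference in the printed loss comes from the optimizer alone.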

Syllabus

– Week 5 – Lecture
– Gradient Descent
– Stochastic Gradient Descent
– Momentum
– Adaptive Methods
– Normalization Layers
– The Death of Optimization


Taught by

Alfredo Canziani

Related Courses

Building Classification Models with scikit-learn (Pluralsight)
Practical Deep Learning for Coders - Full Course (freeCodeCamp)
Neural Networks Made Easy (Udemy)
Intro to Deep Learning (Kaggle)
Stochastic Gradient Descent (Great Learning via YouTube)