Explained Gradient Descent Optimizer

Offered By: Code With Aarohi via YouTube

Tags

Gradient Descent Courses
Algorithms and Data Structures Courses
Deep Learning Courses
Derivatives Courses

Course Description

Overview

Dive into a comprehensive explanation of the Gradient Descent Optimizer and the Backpropagation Algorithm in this 26-minute video. Explore the mathematical foundations behind these crucial machine learning concepts, including the weight update formula for the Gradient Descent Optimizer and the loss calculation. Gain insight into the most commonly used optimization technique in deep learning and machine learning, and understand how it uses the first derivative of the loss to update the weights and move toward the minimum. Learn how this optimization algorithm, which is guaranteed to reach the global minimum only for convex functions, iteratively tweaks a model's parameters during training to drive a given loss function toward a local minimum. Key topics covered include the global minimum, weights, the learning rate, the weight update rule, error, the derivative, and the related equations.
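
The weight update rule referenced above is the standard gradient descent step, w_new = w_old - learning_rate * dLoss/dw. As a minimal, illustrative sketch of that rule (a NumPy example written for this description, not code taken from the video), the snippet below fits a single weight to toy data by repeatedly computing the first derivative of a mean squared error loss and stepping against it:

import numpy as np

# Toy data: the true relationship is y = 2x, so w should converge to 2.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])

w = 0.0               # initial weight (assumed starting point)
learning_rate = 0.05  # step size, the eta in the update equation

for step in range(100):
    y_pred = w * x                   # forward pass: model prediction
    error = y_pred - y               # error term
    loss = np.mean(error ** 2)       # mean squared error loss
    grad = np.mean(2 * error * x)    # first derivative dLoss/dw
    w = w - learning_rate * grad     # weight update: w <- w - eta * dLoss/dw

print(w)  # approximately 2.0 after 100 iterations

Because this toy loss is convex in w, the iterations settle at the global minimum; on the non-convex losses typical of deep learning, the same rule only guarantees progress toward a local minimum.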

Syllabus

Introduction
Global Minimum
Weight
Learning Rate
Weight Update
Error
Derivative
Equation


Taught by

Code With Aarohi

Related Courses

Calculus One
Ohio State University via Coursera
Matemáticas y Movimiento (Mathematics and Motion)
Tecnológico de Monterrey via Coursera
Mathematical Methods for Quantitative Finance
University of Washington via Coursera
Çok değişkenli Fonksiyon I: Kavramlar / Multivariable Calculus I: Concepts
Koç University via Coursera
Preparing for the AP* Calculus AB and BC Exams
University of Houston System via Coursera