Explained Gradient Descent Optimizer
Offered By: Code With Aarohi via YouTube
Course Description
Overview
Dive into a comprehensive explanation of the Gradient Descent Optimizer and the Backpropagation Algorithm in this 26-minute video. Explore the mathematical foundations behind these crucial machine learning concepts, including the weight update formula for the Gradient Descent Optimizer and loss calculation. Gain insight into the most commonly used optimization technique in deep learning and machine learning, and understand how it uses the first derivative of the loss to update the weights and move toward a minimum. Learn how this iterative optimization algorithm tweaks parameters to minimize a given function, reaching the global minimum when that function is convex and a local minimum otherwise. Cover key topics such as the global minimum, weights, the learning rate, weight updates, error, derivatives, and the underlying equations.
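The weight update rule described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the video: it assumes a simple convex loss f(w) = (w - 3)^2, whose first derivative is 2(w - 3), so the global minimum sits at w = 3.

```python
def gradient(w):
    """First derivative of the example loss f(w) = (w - 3)**2."""
    return 2 * (w - 3)

def gradient_descent(w=0.0, learning_rate=0.1, steps=100):
    """Repeatedly apply the weight update rule: w = w - lr * f'(w)."""
    for _ in range(steps):
        w = w - learning_rate * gradient(w)
    return w

# Starting from w = 0, the iterates converge toward the global minimum at w = 3.
print(gradient_descent())
```

Because this example loss is convex, the local minimum the iterations settle into is also the global minimum, which is the case the video's convexity discussion highlights.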
Syllabus
Introduction
Global Minimum
Weight
Learning Rate
Weight Update
Error
Derivative
Equation
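The syllabus topics above (error, derivative, learning rate, weight update, equation) all come together in a single update step. The following is a hypothetical one-weight example, not taken from the video, assuming a linear prediction y_pred = w * x and a squared-error loss:

```python
# Hypothetical single-weight example tying the syllabus topics together:
#   error:      e = y_true - y_pred
#   loss:       L = e**2
#   derivative: dL/dw = -2 * x * e            (by the chain rule)
#   equation:   w_new = w - learning_rate * dL/dw

x, y_true = 2.0, 4.0       # one training sample (assumed values)
w = 0.5                    # initial weight
learning_rate = 0.05

y_pred = w * x                         # forward pass
loss_before = (y_true - y_pred) ** 2   # squared error: (4 - 1)^2 = 9
grad = -2 * x * (y_true - y_pred)      # dL/dw = -2 * 2 * 3 = -12

w = w - learning_rate * grad           # update: 0.5 - 0.05 * (-12) = 1.1
loss_after = (y_true - w * x) ** 2     # (4 - 2.2)^2 = 3.24

print(w, loss_before, loss_after)      # the loss shrinks after the update
```

A single step already reduces the error here; repeating the step drives the loss toward its minimum, which is the iterative behavior the video walks through.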
Taught by
Code With Aarohi
Related Courses
Neural Networks for Machine Learning — University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques) — National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning — University of Washington via Coursera
Прикладные задачи анализа данных (Applied Data Analysis Problems) — Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning — Microsoft via edX