Training a Neural Network - Backward Propagation and Gradient Descent
Offered By: Valerio Velardo - The Sound of AI via YouTube
Course Description
Overview
Explore the theory and mathematics behind training neural networks with backpropagation and gradient descent in this 22-minute video. Get a high-level overview of the training process, meet the "prediction wizard" and "error wizard" metaphors, examine the gradient of the error function, and analyze the elements of a neural network. Learn how gradient descent is applied to optimize a network. Accompanying slides are available as a visual aid, and you can join The Sound of AI community for further discussion, hire the presenter as a consultant, or connect through social media for additional resources and networking.
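To preview the core idea the video builds toward, here is a minimal sketch of gradient descent on a one-dimensional error function. The function E(w) = (w - 3)^2, the starting weight, and the learning rate are illustrative assumptions, not values from the video:

```python
def gradient(w):
    """Derivative of the illustrative error E(w) = (w - 3)^2 with respect to w."""
    return 2 * (w - 3)

def gradient_descent(w, learning_rate=0.1, steps=100):
    """Repeatedly step against the gradient to reduce the error."""
    for _ in range(steps):
        w = w - learning_rate * gradient(w)
    return w

# Starting from w = 0, the weight converges toward the minimum at w = 3.
w_opt = gradient_descent(w=0.0)
print(round(w_opt, 4))
```

In a real network the same update rule is applied to every weight, with backpropagation supplying each weight's partial derivative of the error.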
Syllabus
Introduction
High-level overview
Prediction wizard
Error wizard
Gradient of error function
Neural network elements
Gradient descent
Taught by
Valerio Velardo - The Sound of AI
Related Courses
Neural Networks for Machine Learning (University of Toronto via Coursera)
Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)
Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)