
Training a Neural Network - Implementing Backpropagation and Gradient Descent from Scratch

Offered By: Valerio Velardo - The Sound of AI via YouTube

Tags

Backpropagation Courses, Machine Learning Courses, Deep Learning Courses, Neural Networks Courses, Gradient Descent Courses, Derivatives Courses

Course Description

Overview

Dive into a comprehensive video tutorial on implementing backpropagation and gradient descent from scratch in Python. Learn how to train a neural network to sum two numbers. Explore key concepts including data representation, derivatives, array reshaping, and the construction of a multilayer perceptron (MLP). Follow along as the instructor implements backpropagation, tests it, and applies gradient descent to train the MLP, gaining hands-on insight into the details of neural network training. The accompanying code is available on GitHub for further practice and experimentation.
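
The GitHub repository linked from the video contains the instructor's full implementation. As a rough companion, the following is a minimal sketch, not the instructor's code: all names, layer sizes, and hyperparameters here are illustrative assumptions. It shows the same idea the tutorial covers, a tiny sigmoid MLP trained with hand-written backpropagation and batch gradient descent to learn the sum of two small numbers.

# Minimal illustrative sketch; layer sizes and hyperparameters are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(activation):
    # Derivative of the sigmoid written in terms of its output.
    return activation * (1.0 - activation)

rng = np.random.default_rng(0)

# Network with 2 inputs, 3 hidden units, 1 output.
w1 = rng.normal(size=(2, 3))
w2 = rng.normal(size=(3, 1))

# Training data: pairs of numbers in [0, 0.5) and their sums,
# so every target fits inside the sigmoid's (0, 1) output range.
inputs = rng.random((1000, 2)) * 0.5
targets = inputs.sum(axis=1, keepdims=True)

learning_rate = 1.0
for epoch in range(5000):
    # Forward pass.
    hidden = sigmoid(inputs @ w1)
    outputs = sigmoid(hidden @ w2)
    error = outputs - targets

    # Backpropagation: push the error back through each layer.
    delta_out = error * sigmoid_derivative(outputs)
    delta_hidden = (delta_out @ w2.T) * sigmoid_derivative(hidden)

    # Gradient descent: step each weight matrix against its averaged gradient.
    w2 -= learning_rate * (hidden.T @ delta_out) / len(inputs)
    w1 -= learning_rate * (inputs.T @ delta_hidden) / len(inputs)

    if epoch % 1000 == 0:
        print(f"epoch {epoch}: mse = {np.mean(error ** 2):.5f}")

# After training, the prediction for 0.3 + 0.1 should be close to 0.4.
print(sigmoid(sigmoid(np.array([[0.3, 0.1]]) @ w1) @ w2))

Keeping the inputs in [0, 0.5) is what lets a sigmoid output layer represent the target sums directly; a linear output layer would remove that restriction.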

Syllabus

Introduction
Data Representation
Derivatives
Reshape
Backpropagation
Creating an MLP
Implementing backpropagation
Testing backpropagation
Implementing gradient descent
Applying gradient descent
Printing weights
Testing
Gradient Descent
Train
Train MLP


Taught by

Valerio Velardo - The Sound of AI

Related Courses

Deep Learning Fundamentals with Keras
IBM via edX
Deep Learning Essentials
Université de Montréal via edX
Deep Learning with TensorFlow 2.0
Udemy
Data Science: Deep Learning and Neural Networks in Python
Udemy
Neural Networks and Deep Learning (Нейронные сети и глубокое обучение)
DeepLearning.AI via Coursera