Long Short-Term Memory with PyTorch + Lightning

Offered By: StatQuest with Josh Starmer via YouTube

Tags

Long Short-Term Memory (LSTM) Courses, Deep Learning Courses, PyTorch Courses, TensorBoard Courses

Course Description

Overview

Learn how to implement and train Long Short-Term Memory (LSTM) networks using PyTorch and Lightning in this comprehensive 33-minute tutorial. Code an LSTM unit from scratch, then utilize PyTorch's nn.LSTM() function for comparison. Discover Lightning's powerful features, including adding training epochs without restarting and easily visualizing training results. Explore key concepts such as importing modules, creating LSTM classes, initializing tensors, performing LSTM calculations, configuring optimizers, and calculating loss. Gain hands-on experience in training both custom-built and PyTorch-provided LSTM models, and learn to evaluate training progress using TensorBoard. Perfect for those looking to deepen their understanding of LSTM implementation and training techniques in PyTorch and Lightning.
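The from-scratch portion of the tutorial implements the standard LSTM gate math one step at a time. As a rough sketch of that idea (the weight names and dict-based parameters here are illustrative, not the video's exact code), a single LSTM unit step on scalar values looks like:

```python
import torch

def lstm_unit(x, h, c, params):
    """One LSTM step on scalar inputs, following the standard gate equations.

    `params` maps illustrative weight/bias names to scalar tensors:
    w* multiply the input x, u* multiply the short-term memory h.
    """
    # Forget gate: what fraction of the long-term memory (c) to keep
    f = torch.sigmoid(params["wf"] * x + params["uf"] * h + params["bf"])
    # Input gate and candidate value: how much new memory to add
    i = torch.sigmoid(params["wi"] * x + params["ui"] * h + params["bi"])
    g = torch.tanh(params["wg"] * x + params["ug"] * h + params["bg"])
    c_new = f * c + i * g
    # Output gate: what fraction of the updated memory becomes
    # the new short-term memory (h)
    o = torch.sigmoid(params["wo"] * x + params["uo"] * h + params["bo"])
    h_new = o * torch.tanh(c_new)
    return h_new, c_new
```

The `forward` pass in the video unrolls a unit like this over each value in the input sequence, feeding `h_new` and `c_new` back in at every step.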

Syllabus

Awesome song and introduction
Importing the modules
An outline of an LSTM class
init: Creating and initializing the tensors
lstm_unit: Doing the LSTM math
forward: Make a forward pass through an unrolled LSTM
configure_optimizers: Configure the...optimizers.
training_step: Calculate the loss and log progress
Using and training our homemade LSTM
Evaluating training with TensorBoard
Adding more epochs to training
Using and training PyTorch's nn.LSTM
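The comparison half of the syllabus swaps the homemade unit for PyTorch's built-in `nn.LSTM()`. A minimal sketch of that approach, assuming a single-feature input sequence (the video wraps this in a Lightning module with `configure_optimizers` and `training_step`; a plain `nn.Module` is shown here to keep the example self-contained):

```python
import torch
import torch.nn as nn

class TinyLSTM(nn.Module):
    """Illustrative model: one nn.LSTM layer predicting from the last step."""

    def __init__(self):
        super().__init__()
        # input_size=1: one value per time step; hidden_size=1: scalar memory
        self.lstm = nn.LSTM(input_size=1, hidden_size=1)

    def forward(self, x):
        # nn.LSTM expects shape (seq_len, batch, input_size)
        out, (h, c) = self.lstm(x.view(len(x), 1, 1))
        # The prediction is the output at the final time step
        return out[-1].squeeze()

model = TinyLSTM()
# SGD optimizer and squared-error loss, roughly what configure_optimizers
# and training_step would set up (values below are made up for illustration)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.tensor([0.0, 0.5, 0.25, 1.0])
loss = (model(x) - torch.tensor(0.0)) ** 2
loss.backward()
optimizer.step()
```

In Lightning, the optimizer and loss would live in `configure_optimizers()` and `training_step()`, and `self.log()` calls there are what TensorBoard later visualizes.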


Taught by

StatQuest with Josh Starmer

Related Courses

Practical Machine Learning with Tensorflow
Google via Swayam
Advanced Deployment Scenarios with TensorFlow
DeepLearning.AI via Coursera
TensorFlow: Data and Deployment
DeepLearning.AI via Coursera
Introduction to TensorFlow in R
DataCamp
Building and Deploying Deep Learning Applications with TensorFlow
LinkedIn Learning