Long Short-Term Memory (LSTM), Clearly Explained
Offered By: StatQuest with Josh Starmer via YouTube
Course Description
Overview
Explore the intricacies of Long Short-Term Memory (LSTM) networks in this 21-minute video tutorial. Discover how LSTMs overcome the limitations of basic recurrent neural networks by handling longer sequences of data without running into exploding or vanishing gradients. Learn about the sigmoid and tanh activation functions, and delve into the three stages of an LSTM unit: determining the percent of long-term memory to remember, updating the long-term memory, and updating the short-term memory. Conclude with a practical demonstration of an LSTM in action on real data. The video is available with artificial voice dubbing in Spanish and Portuguese for increased accessibility.
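The three stages described above can be sketched as a single-unit LSTM step in plain Python. This is a minimal illustration, not code from the video: the weight names (wf_x, wi_h, etc.) and the dictionary-of-scalars layout are assumptions chosen to mirror the video's one-number-per-memory example, and all weight values below are made up.

```python
import math

def sigmoid(x):
    # Sigmoid squashes any input to a value between 0 and 1 (a "percent")
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, short_mem, long_mem, w):
    """One LSTM step for a single scalar input x.

    short_mem = short-term memory (hidden state), long_mem = long-term
    memory (cell state). w is a dict of illustrative scalar weights/biases.
    """
    # Stage 1: the forget gate's sigmoid gives the percent of the
    # long-term memory to remember.
    forget = sigmoid(w["wf_x"] * x + w["wf_h"] * short_mem + w["bf"])

    # Stage 2: update long-term memory. A sigmoid (input gate) decides the
    # percent of a tanh candidate memory to add to what was remembered.
    input_gate = sigmoid(w["wi_x"] * x + w["wi_h"] * short_mem + w["bi"])
    candidate = math.tanh(w["wc_x"] * x + w["wc_h"] * short_mem + w["bc"])
    long_mem = forget * long_mem + input_gate * candidate

    # Stage 3: update short-term memory. An output gate (sigmoid) decides
    # the percent of tanh(long-term memory) to pass on as the new
    # short-term memory.
    output_gate = sigmoid(w["wo_x"] * x + w["wo_h"] * short_mem + w["bo"])
    short_mem = output_gate * math.tanh(long_mem)

    return short_mem, long_mem
```

Because every gate is a sigmoid in (0, 1) and the candidate passes through tanh in (-1, 1), the short-term memory always stays in (-1, 1), while the long-term memory can grow or shrink gradually, which is what lets gradients flow through long sequences.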
Syllabus
Awesome song, introduction and main ideas
The sigmoid and tanh activation functions
LSTM Stage 1: The percent to remember
LSTM Stage 2: Update the long-term memory
LSTM Stage 3: Update the short-term memory
LSTM in action with real data
Taught by
StatQuest with Josh Starmer
Related Courses
Data Analysis - Johns Hopkins University via Coursera
Computing for Data Analysis - Johns Hopkins University via Coursera
Scientific Computing - University of Washington via Coursera
Introduction to Data Science - University of Washington via Coursera
Web Intelligence and Big Data - Indian Institute of Technology Delhi via Coursera