YoVDO

Long Short-Term Memory (LSTM), Clearly Explained

Offered By: StatQuest with Josh Starmer via YouTube

Tags

Long Short-Term Memory (LSTM) Courses
Data Science Courses
Deep Learning Courses
Neural Networks Courses

Course Description

Overview

Explore the intricacies of Long Short-Term Memory (LSTM) networks in this 21-minute video tutorial. Discover how LSTMs overcome the limitations of basic recurrent neural networks by effectively handling longer sequences of data without running into the vanishing and exploding gradient problems. Learn about the sigmoid and tanh activation functions, and delve into the three stages of an LSTM unit: determining the percent of long-term memory to remember, updating the long-term memory, and updating the short-term memory. Conclude with a practical demonstration of LSTM in action using real data. The video is available with artificial voice dubbing in Spanish and Portuguese for increased accessibility.
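The three stages described above can be sketched as a single LSTM step. This is a minimal illustration in plain Python, not the video's code: it uses scalar inputs and a hypothetical dictionary of weights (all names such as `wf_x` and `bf` are invented for this sketch), mirroring the forget gate (Stage 1), the long-term memory update (Stage 2), and the short-term memory update (Stage 3).

```python
import math

def sigmoid(x):
    # Squashes any input into (0, 1), so gate outputs read as
    # "what percent to keep / add / output".
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, short_term, long_term, w):
    # Stage 1: the forget gate decides the percent of the
    # long-term memory (cell state) to remember.
    forget = sigmoid(w["wf_x"] * x + w["wf_h"] * short_term + w["bf"])

    # Stage 2: the input gate and a tanh candidate (range -1..1)
    # together update the long-term memory.
    input_gate = sigmoid(w["wi_x"] * x + w["wi_h"] * short_term + w["bi"])
    candidate = math.tanh(w["wc_x"] * x + w["wc_h"] * short_term + w["bc"])
    long_term = forget * long_term + input_gate * candidate

    # Stage 3: the output gate decides how much of the (squashed)
    # long-term memory becomes the new short-term memory (hidden state).
    output_gate = sigmoid(w["wo_x"] * x + w["wo_h"] * short_term + w["bo"])
    short_term = output_gate * math.tanh(long_term)
    return short_term, long_term
```

To process a sequence, the step is applied repeatedly, carrying both memories forward; in a trained network the weights would come from gradient descent, whereas here any values merely demonstrate the data flow.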

Syllabus

Awesome song, introduction and main ideas
The sigmoid and tanh activation functions
LSTM Stage 1: The percent to remember
LSTM Stage 2: Update the long-term memory
LSTM Stage 3: Update the short-term memory
LSTM in action with real data


Taught by

StatQuest with Josh Starmer

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems in Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX