Programming LSTM with Keras and TensorFlow
Offered By: Jeff Heaton via YouTube
Course Description
Overview
Explore Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) layers in this 28-minute video tutorial. Learn how these layer types are used to build recurrent neural networks in Keras, laying the groundwork for Natural Language Processing (NLP) and time series prediction. Dive into LSTM diagrams, internals, and sigmoid activation functions before comparing LSTM with GRU. Follow along with practical examples, including a basic LSTM implementation and a sunspot prediction model. Access the accompanying code on GitHub and discover additional resources to further your deep learning journey.
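As a rough illustration of the kind of Keras model the video builds, the sketch below stacks a single LSTM layer on a dense output head; the layer size, sequence length, and feature count are arbitrary placeholders rather than values from the video, and swapping LSTM for GRU gives the variant compared in the tutorial.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense  # GRU is a drop-in alternative

# Arbitrary placeholder dimensions: 10 time steps, 1 feature per step.
SEQ_LEN, N_FEATURES = 10, 1

model = Sequential([
    # 64 units is an illustrative choice; replacing LSTM(64) with GRU(64)
    # yields the GRU version discussed in the video.
    LSTM(64, input_shape=(SEQ_LEN, N_FEATURES)),
    Dense(1),  # single regression output
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```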
Syllabus
Introduction
Context Neurons
LSTM Diagram
LSTM Internals
Sigmoid Functions
GRU
LSTM
LSTM vs GRU
LSTM Example 1
Training Results
Sunspots
Training Data
Sequences (see the windowing sketch after the syllabus)
Data Structure
Train Model
Results
Outro
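The sunspot portion of the syllabus (Training Data, Sequences, Data Structure, Train Model) amounts to slicing a univariate series into overlapping windows and fitting a recurrent model to predict the next value. The sketch below shows one common way to do that; the synthetic data, window length, and model size are illustrative assumptions, not the video's actual code.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def to_sequences(series, window):
    """Slice a 1-D series into (samples, window, 1) inputs and next-step targets."""
    x, y = [], []
    for i in range(len(series) - window):
        x.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(x)[..., np.newaxis], np.array(y)

# Synthetic stand-in for the monthly sunspot counts used in the video.
series = np.sin(np.linspace(0, 50, 1000)) + np.random.normal(0, 0.1, 1000)
window = 10  # arbitrary window length
x, y = to_sequences(series, window)

model = Sequential([
    LSTM(64, input_shape=(window, 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, verbose=0)  # a few epochs, just to show the workflow

print(model.predict(x[:1]))  # one-step-ahead prediction for the first window
```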
Taught by
Jeff Heaton
Related Courses
Reinforcement Learning for Trading Strategies (New York Institute of Finance via Coursera)
Natural Language Processing with Sequence Models (DeepLearning.AI via Coursera)
Fake News Detection with Machine Learning (Coursera Project Network via Coursera)
English/French Translator: Long Short Term Memory Networks (Coursera Project Network via Coursera)
Text Classification Using Word2Vec and LSTM on Keras (Coursera Project Network via Coursera)