Recurrent Neural Networks - Full Stack Deep Learning - Spring 2021

Offered By: The Full Stack via YouTube

Tags

Deep Learning Courses · Long Short-Term Memory (LSTM) Courses · Attention Mechanisms Courses · Encoder-Decoder Architecture Courses · Sequence Modeling Courses

Course Description

Overview

Dive deep into Recurrent Neural Networks (RNNs) in this comprehensive lecture from the Full Stack Deep Learning Spring 2021 series. Explore sequence problems before delving into the RNN architecture, along with its challenges and their solutions. Examine a case study on machine translation at Google, and learn about the CTC loss function, which is crucial for the lab work. Analyze the advantages and disadvantages of RNNs, and get a preview of non-recurrent sequence models. Topics covered include sequence problems, a review of RNNs, the vanishing gradient issue, LSTMs and their variants, bidirectionality and attention in Google's Neural Machine Translation, CTC loss, the pros and cons of encoder-decoder LSTM architectures, and an introduction to WaveNet.
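
The lecture itself is conceptual, but as a rough illustration of the encoder-decoder LSTM architecture it discusses, here is a minimal PyTorch sketch. This is not code from the course; all class and variable names are illustrative, and the model is deliberately toy-sized:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Toy encoder-decoder LSTM: encode a source sequence into a
    final hidden state, then unroll a decoder LSTM from that state."""
    def __init__(self, src_vocab, tgt_vocab, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encoder compresses the whole source sequence into (h, c).
        _, state = self.encoder(self.src_emb(src))
        # Decoder starts from the encoder's final state -- the fixed-size
        # bottleneck that attention (covered in the lecture) relaxes.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 7))   # batch of 2 source sequences
tgt = torch.randint(0, 1000, (2, 5))   # batch of 2 target prefixes
print(model(src, tgt).shape)           # torch.Size([2, 5, 1000])
```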

Syllabus

- Introduction
- Sequence Problems
- Review of RNNs
- Vanishing Gradient Issue
- LSTMs and Their Variants
- Bidirectionality and Attention from Google's Neural Machine Translation
- CTC Loss
- Pros and Cons of Encoder-Decoder LSTM Architectures
- WaveNet
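
The CTC loss on the syllabus matters for the lab because it aligns unsegmented output sequences (such as transcripts) to inputs without frame-level labels. A minimal sketch using PyTorch's nn.CTCLoss follows; the choice of framework and all dimensions are assumptions for illustration, not the course's actual lab code:

```python
import torch
import torch.nn as nn

# CTC expects log-probabilities of shape (time, batch, classes),
# where class 0 is conventionally the blank token.
T, B, C = 50, 4, 20          # input length, batch size, classes (incl. blank)
log_probs = torch.randn(T, B, C).log_softmax(dim=2)

# Targets are label sequences concatenated into one 1-D tensor, no blanks.
target_lengths = torch.tensor([10, 8, 12, 9])
targets = torch.randint(1, C, (int(target_lengths.sum()),))
input_lengths = torch.full((B,), T, dtype=torch.long)

ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss.item())
```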


Taught by

The Full Stack

Related Courses

Reinforcement Learning for Trading Strategies
New York Institute of Finance via Coursera
Natural Language Processing with Sequence Models
DeepLearning.AI via Coursera
Fake News Detection with Machine Learning
Coursera Project Network via Coursera
English/French Translator: Long Short Term Memory Networks
Coursera Project Network via Coursera
Text Classification Using Word2Vec and LSTM on Keras
Coursera Project Network via Coursera