Recurrent Neural Networks - Full Stack Deep Learning - Spring 2021
Offered By: The Full Stack via YouTube
Course Description
Overview
Dive deep into Recurrent Neural Networks (RNNs) in this comprehensive lecture from the Full Stack Deep Learning Spring 2021 series. Explore sequence problems before delving into the RNN architecture, the challenges it raises, and their solutions. Examine a case study of Machine Translation at Google, and learn about the CTC loss function, which is crucial for the lab work. Analyze the advantages and disadvantages of RNNs, and get a preview of non-recurrent sequence models. Topics covered include sequence problems, a review of RNNs, the vanishing gradient issue, LSTMs and their variants, bidirectionality and attention in Google's Neural Machine Translation, CTC loss, the pros and cons of encoder-decoder LSTM architectures, and an introduction to WaveNet.
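To make the RNN review concrete, here is a minimal sketch of the vanilla recurrence the lecture revisits. It assumes PyTorch (the framework used in this course's labs), and all dimensions are illustrative rather than taken from the lecture:

```python
# A minimal sketch of the vanilla RNN recurrence: h_t = tanh(W_x x_t + W_h h_{t-1} + b).
# Assumes PyTorch; sizes are made up for illustration.
import torch

input_size, hidden_size, seq_len = 8, 16, 5

W_x = torch.randn(hidden_size, input_size) * 0.1   # input-to-hidden weights
W_h = torch.randn(hidden_size, hidden_size) * 0.1  # hidden-to-hidden weights
b = torch.zeros(hidden_size)

x = torch.randn(seq_len, input_size)  # one input sequence
h = torch.zeros(hidden_size)          # initial hidden state

for t in range(seq_len):
    # The same weights are reused at every step; the hidden state carries
    # information forward, which is what makes the network "recurrent".
    h = torch.tanh(W_x @ x[t] + W_h @ h + b)

print(h.shape)  # torch.Size([16])
```

Repeatedly multiplying through W_h in this loop is also what drives the vanishing gradient issue the lecture covers next.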
Syllabus
- Introduction
- Sequence Problems
- Review of RNNs
- Vanishing Gradient Issue
- LSTMs and Their Variants
- Bidirectionality and Attention from Google's Neural Machine Translation
- CTC Loss (see the sketch after this list)
- Pros and Cons of Encoder-Decoder LSTM Architectures
- WaveNet
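Since the CTC loss above is flagged as crucial for the lab work, the sketch below pairs PyTorch's built-in nn.CTCLoss with a bidirectional LSTM encoder, touching the LSTM and bidirectionality items as well. The pairing and all shapes are illustrative assumptions, not the lab's actual code:

```python
# A minimal sketch: bidirectional LSTM encoder + CTC loss.
# Assumes PyTorch; batch size, lengths, and class counts are illustrative.
import torch
import torch.nn as nn

N, T, F, H, C = 4, 50, 32, 64, 20   # batch, time steps, features, hidden, classes

encoder = nn.LSTM(input_size=F, hidden_size=H, bidirectional=True,
                  batch_first=True)
# Bidirectionality concatenates forward and backward states, hence 2 * H.
classifier = nn.Linear(2 * H, C)
ctc = nn.CTCLoss(blank=0)           # class 0 is reserved as the CTC blank

x = torch.randn(N, T, F)            # a batch of input sequences
features, _ = encoder(x)            # (N, T, 2H): context from both directions
log_probs = classifier(features).log_softmax(dim=2)

# nn.CTCLoss expects time-major log-probabilities of shape (T, N, C).
log_probs = log_probs.permute(1, 0, 2)

S = 10                              # target transcript length
targets = torch.randint(1, C, (N, S), dtype=torch.long)   # labels, no blanks
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), S, dtype=torch.long)

# CTC marginalizes over all alignments of the S targets to the T frames,
# so no frame-level alignment labels are needed.
loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss.item())
```

Note the permute call: nn.CTCLoss wants time-major inputs, a common source of shape bugs when the rest of the model is batch-first.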
Taught by
The Full Stack
Related Courses
- Natural Language Generation in Python (DataCamp)
- Machine Translation with Keras (DataCamp)
- Pytorch Transformers from Scratch - Attention Is All You Need (Aladdin Persson via YouTube)
- Pytorch Seq2Seq Tutorial for Machine Translation (Aladdin Persson via YouTube)
- Region Mutual Information Loss for Semantic Segmentation (University of Central Florida via YouTube)