Recurrent Neural Networks - Full Stack Deep Learning - Spring 2021
Offered By: The Full Stack via YouTube
Course Description
Overview
Dive deep into Recurrent Neural Networks (RNNs) in this comprehensive lecture from the Full Stack Deep Learning Spring 2021 series. Explore sequence problems before delving into the RNN architecture, its challenges, and the solutions to them. Examine a case study on Machine Translation at Google, and learn about the CTC loss function, which is central to the lab work. Analyze the advantages and disadvantages of RNNs, and get a preview of non-recurrent sequence models. Topics covered include sequence problems, a review of RNNs, the vanishing gradient issue, LSTMs and their variants, bidirectionality and attention in Google's Neural Machine Translation, CTC loss, the pros and cons of encoder-decoder LSTM architectures, and an introduction to WaveNet.
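To make the lecture's central object concrete, here is a minimal sketch of the vanilla RNN recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b), written in PyTorch (an assumption; the dimensions below are illustrative, not from the lecture):

```python
import torch

torch.manual_seed(0)
input_dim, hidden_dim, seq_len = 8, 16, 20  # illustrative sizes

W_x = torch.randn(hidden_dim, input_dim) * 0.1   # input-to-hidden weights
W_h = torch.randn(hidden_dim, hidden_dim) * 0.1  # hidden-to-hidden weights
b = torch.zeros(hidden_dim)

x = torch.randn(seq_len, input_dim)  # one input sequence
h = torch.zeros(hidden_dim)          # initial hidden state

for t in range(seq_len):
    # The same W_h multiplies the hidden state at every step, so
    # backpropagation through time multiplies many Jacobians of this
    # map together. Those products shrink or blow up over long
    # sequences, which is the vanishing/exploding-gradient issue
    # that LSTMs are designed to mitigate.
    h = torch.tanh(W_x @ x[t] + W_h @ h + b)
```

For the CTC loss highlighted for the labs, PyTorch ships an implementation as torch.nn.CTCLoss. A minimal usage sketch with random tensors follows; all shapes and the choice of 0 as the blank index are assumptions for illustration:

```python
import torch
import torch.nn as nn

T, N, C, S = 30, 4, 10, 12  # time steps, batch size, classes (incl. blank), max target length

log_probs = torch.randn(T, N, C).log_softmax(dim=2)    # per-step log-probabilities from a model
targets = torch.randint(1, C, (N, S))                  # label indices; 0 is reserved for blank
input_lengths = torch.full((N,), T, dtype=torch.long)  # length of each input sequence
target_lengths = torch.randint(1, S + 1, (N,))         # length of each target sequence

ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss.item())
```

CTC sums over all monotonic alignments between the T input steps and each target sequence, which is what lets a model train on unsegmented sequences without frame-level labels.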
Syllabus
- Introduction
- Sequence Problems
- Review of RNNs
- Vanishing Gradient Issue
- LSTMs and Their Variants
- Bidirectionality and Attention from Google's Neural Machine Translation
- CTC Loss
- Pros and Cons of Encoder-Decoder LSTM Architectures
- WaveNet
Taught by
The Full Stack
Related Courses
- Neural Networks for Machine Learning - University of Toronto via Coursera
- 機器學習技法 (Machine Learning Techniques) - National Taiwan University via Coursera
- Machine Learning Capstone: An Intelligent Application with Deep Learning - University of Washington via Coursera
- Прикладные задачи анализа данных (Applied Problems of Data Analysis) - Moscow Institute of Physics and Technology via Coursera
- Leading Ambitious Teaching and Learning - Microsoft via edX