Recurrent Neural Networks and Transformers
Offered By: Alexander Amini via YouTube
Course Description
Overview
Explore the fundamentals of recurrent neural networks and transformers in this comprehensive lecture from MIT's Introduction to Deep Learning course. Delve into sequence modeling, neurons with recurrence, and the intuition behind RNNs. Learn how to unfold RNNs, build them from scratch, and understand their design criteria for sequential modeling. Examine word prediction examples and backpropagation through time, while addressing gradient issues. Discover long short-term memory (LSTM) and various RNN applications. Investigate attention mechanisms, their intuition, and relationship to search. Gain insights into learning attention with neural networks, scaling attention, and its applications. Conclude with a summary of key concepts in this 58-minute lecture delivered by Ava Soleimany, offering a solid foundation in advanced deep learning techniques.
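The lecture's "RNNs from scratch" and "Unfolding RNNs" segments center on a single recurrence relation. As a rough preview (not the lecture's own code; all names and sizes are illustrative), the core step h_t = tanh(W_hh·h_{t-1} + W_xh·x_t + b) can be sketched in NumPy, with unfolding amounting to applying the same step at each timestep:

```python
import numpy as np

# Minimal RNN-cell sketch: the hidden state is updated from the current
# input and the previous hidden state using shared weights at every step.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8

W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
b_h = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    """One step of the recurrence: h_t = tanh(W_xh x_t + W_hh h_prev + b)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# "Unfolding" the RNN: reuse the same step (same weights) across timesteps.
xs = rng.normal(size=(5, input_dim))   # a length-5 input sequence
h = np.zeros(hidden_dim)
for x_t in xs:
    h = rnn_step(x_t, h)

print(h.shape)  # (8,)
```

Because the same weight matrices are reused at every timestep, backpropagation through time multiplies gradients through W_hh repeatedly, which is the source of the vanishing/exploding gradient issues the lecture addresses.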
Syllabus
- Introduction
- Sequence modeling
- Neurons with recurrence
- Recurrent neural networks
- RNN intuition
- Unfolding RNNs
- RNNs from scratch
- Design criteria for sequential modeling
- Word prediction example
- Backpropagation through time
- Gradient issues
- Long short-term memory (LSTM)
- RNN applications
- Attention fundamentals
- Intuition of attention
- Attention and search relationship
- Learning attention with neural networks
- Scaling attention and applications
- Summary
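The attention portion of the syllabus (intuition, relation to search, scaling) builds toward scaled dot-product attention. A minimal NumPy sketch of that mechanism, assuming the standard formulation softmax(QKᵀ/√d_k)·V rather than anything specific to this lecture's code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V.

    Each query scores every key (a soft "search" over the keys),
    and the output is the corresponding weighted sum of values.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # similarity of queries to keys
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 6))  # 3 queries of dimension 6
K = rng.normal(size=(5, 6))  # 5 keys
V = rng.normal(size=(5, 6))  # 5 values

out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (3, 6) (3, 5)
```

Each row of the weight matrix sums to 1, which is the "soft search" intuition the lecture develops: every query distributes a unit of attention across all keys.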
Taught by
Alexander Amini (https://www.youtube.com/@AAmini/videos)
Related Courses
- Deep Learning for Natural Language Processing — University of Oxford via Independent
- Sequence Models — DeepLearning.AI via Coursera
- Deep Learning Part 1 (IITM) — Indian Institute of Technology Madras via Swayam
- Deep Learning - Part 1 — Indian Institute of Technology, Ropar via Swayam
- Deep Learning - IIT Ropar — Indian Institute of Technology, Ropar via Swayam