Pytorch Transformers from Scratch - Attention Is All You Need
Offered By: Aladdin Persson via YouTube
Course Description
Overview
Dive into a comprehensive 57-minute video tutorial on implementing PyTorch Transformers from scratch, based on the groundbreaking "Attention Is All You Need" paper. Explore the original transformer architecture, starting with a detailed paper review and progressing through key components such as the attention mechanism, transformer blocks, encoder, and decoder. Learn how to assemble these elements to create a complete Transformer model, and gain practical insights through a small example and error-fixing session. Benefit from additional resources, including recommended courses and free materials, to further enhance your understanding of machine learning, deep learning, and natural language processing.
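The video writes its own attention module step by step; purely as orientation, a minimal single-head scaled dot-product self-attention sketch in PyTorch might look like the following (names such as SelfAttention and embed_size are illustrative assumptions, not the tutorial's exact code):

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Minimal single-head scaled dot-product self-attention (illustrative sketch)."""

    def __init__(self, embed_size):
        super().__init__()
        self.embed_size = embed_size  # illustrative hyperparameter name
        self.to_queries = nn.Linear(embed_size, embed_size, bias=False)
        self.to_keys = nn.Linear(embed_size, embed_size, bias=False)
        self.to_values = nn.Linear(embed_size, embed_size, bias=False)

    def forward(self, x, mask=None):
        # x: (batch, seq_len, embed_size)
        q, k, v = self.to_queries(x), self.to_keys(x), self.to_values(x)
        # Attention scores: (batch, seq_len, seq_len), scaled by sqrt(embed_size)
        scores = q @ k.transpose(-2, -1) / (self.embed_size ** 0.5)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        return weights @ v  # weighted sum of value vectors

# Tiny smoke test: 2 sequences of length 5 with embedding size 64.
out = SelfAttention(64)(torch.randn(2, 5, 64))
print(out.shape)  # torch.Size([2, 5, 64])
```

The tutorial extends this idea to multi-head attention by splitting the embedding into several heads and projecting their concatenated outputs back to the model dimension.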
Syllabus
- Introduction
- Paper Review
- Attention Mechanism
- TransformerBlock
- Encoder
- DecoderBlock
- Decoder
- Putting it together to form The Transformer (see the wiring sketch after this syllabus)
- A Small Example
- Fixing Errors
- Ending
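For a rough sense of how the encoder and decoder compose into the full model (and what the small example at the end exercises), here is a hedged wiring sketch using PyTorch's built-in nn.TransformerEncoder and nn.TransformerDecoder in place of the video's hand-written blocks; every hyperparameter and tensor shape below is illustrative:

```python
import torch
import torch.nn as nn

# Illustrative sizes only; the video picks its own hyperparameters.
d_model, nhead, num_layers, vocab_size = 64, 4, 2, 100

src_embed = nn.Embedding(vocab_size, d_model)
tgt_embed = nn.Embedding(vocab_size, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers)
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
to_vocab = nn.Linear(d_model, vocab_size)

# Toy batch: 2 source sequences of length 9, 2 target prefixes of length 7.
src = torch.randint(0, vocab_size, (2, 9))
tgt = torch.randint(0, vocab_size, (2, 7))

# Causal mask so each target position only attends to earlier positions.
tgt_len = tgt.size(1)
causal = torch.triu(torch.full((tgt_len, tgt_len), float("-inf")), diagonal=1)

memory = encoder(src_embed(src))                        # encode the source
out = decoder(tgt_embed(tgt), memory, tgt_mask=causal)  # decode against the memory
logits = to_vocab(out)
print(logits.shape)  # torch.Size([2, 7, 100])
```

The from-scratch version in the video follows the same data flow, but builds the attention, TransformerBlock, Encoder, and DecoderBlock modules by hand instead of using the built-in layers.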
Taught by
Aladdin Persson
Related Courses
- Sequence Models (DeepLearning.AI via Coursera)
- Modern Natural Language Processing in Python (Udemy)
- Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 (Stanford University via YouTube)
- Long Form Question Answering in Haystack (James Briggs via YouTube)
- Spotify's Podcast Search Explained (James Briggs via YouTube)