
Pytorch Transformers from Scratch - Attention Is All You Need

Offered By: Aladdin Persson via YouTube

Tags

PyTorch Courses, Machine Learning Courses, Deep Learning Courses, Attention Mechanisms Courses, Transformer Models Courses, Encoder-Decoder Architecture Courses

Course Description

Overview

Dive into a comprehensive 57-minute video tutorial on implementing PyTorch Transformers from scratch, based on the groundbreaking "Attention Is All You Need" paper. Explore the original transformer architecture, starting with a detailed paper review and progressing through key components such as the attention mechanism, transformer blocks, encoder, and decoder. Learn how to assemble these elements into a complete Transformer model, and gain practical insights through a small example and an error-fixing session. Benefit from additional resources, including recommended courses and free materials, to further enhance your understanding of machine learning, deep learning, and natural language processing.
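
At the heart of the paper, and of the tutorial, is multi-head scaled dot-product self-attention. The following is a rough, self-contained sketch of that mechanism in PyTorch to orient readers before watching; it is not the tutorial's exact code, and the class name `SelfAttention`, the fused query/key/value projection, and the example sizes are illustrative assumptions.

```python
# A minimal sketch of multi-head scaled dot-product self-attention in PyTorch.
# Illustrative only: the class name, the fused QKV projection, and the sizes
# below are assumptions, not the tutorial's exact implementation.
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    def __init__(self, embed_size: int, heads: int):
        super().__init__()
        assert embed_size % heads == 0, "embed_size must be divisible by heads"
        self.heads = heads
        self.head_dim = embed_size // heads
        # One linear layer produces queries, keys, and values in a single pass.
        self.to_qkv = nn.Linear(embed_size, embed_size * 3, bias=False)
        self.fc_out = nn.Linear(embed_size, embed_size)

    def forward(self, x, mask=None):
        batch, seq_len, _ = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # Split the embedding into `heads` independent attention heads:
        # (batch, seq_len, embed_size) -> (batch, heads, seq_len, head_dim).
        q, k, v = (t.view(batch, seq_len, self.heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        # Attention scores, scaled by sqrt(head_dim) as in the paper.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = scores.softmax(dim=-1)
        # Weighted sum of values, then merge the heads back together.
        out = (weights @ v).transpose(1, 2).reshape(batch, seq_len, -1)
        return self.fc_out(out)


if __name__ == "__main__":
    attention = SelfAttention(embed_size=256, heads=8)
    x = torch.randn(2, 10, 256)   # (batch, sequence length, embed_size)
    print(attention(x).shape)     # torch.Size([2, 10, 256])
```

The output keeps the input shape, which is what lets attention layers be stacked into the encoder and decoder covered later in the video.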

Syllabus

- Introduction
- Paper Review
- Attention Mechanism
- TransformerBlock (see the sketch below the syllabus)
- Encoder
- DecoderBlock
- Decoder
- Putting it together to form the Transformer
- A Small Example
- Fixing Errors
- Ending
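
The TransformerBlock, Encoder, DecoderBlock, and Decoder sections above all stack attention with a position-wise feed-forward network inside residual "Add & Norm" connections. As a rough, self-contained orientation, here is a sketch of one encoder-style block; it substitutes PyTorch's built-in `nn.MultiheadAttention` for the attention module built in the video, and hyperparameters such as `forward_expansion` and `dropout` are assumed values, not the tutorial's.

```python
# A sketch of one encoder-style transformer block, using the built-in
# nn.MultiheadAttention as a stand-in for a hand-written attention module.
# Hyperparameter names and values are illustrative assumptions.
import torch
import torch.nn as nn


class TransformerBlock(nn.Module):
    """Multi-head self-attention + feed-forward network, each wrapped in a
    residual connection followed by layer normalization ("Add & Norm")."""

    def __init__(self, embed_size: int, heads: int,
                 forward_expansion: int = 4, dropout: float = 0.1):
        super().__init__()
        self.attention = nn.MultiheadAttention(embed_size, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(embed_size)
        self.norm2 = nn.LayerNorm(embed_size)
        self.feed_forward = nn.Sequential(
            nn.Linear(embed_size, forward_expansion * embed_size),
            nn.ReLU(),
            nn.Linear(forward_expansion * embed_size, embed_size),
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        # Self-attention sublayer with residual connection and normalization.
        attn_out, _ = self.attention(x, x, x, need_weights=False)
        x = self.norm1(x + self.dropout(attn_out))
        # Position-wise feed-forward sublayer, again with Add & Norm.
        ff_out = self.feed_forward(x)
        return self.norm2(x + self.dropout(ff_out))


if __name__ == "__main__":
    block = TransformerBlock(embed_size=256, heads=8)
    x = torch.randn(2, 10, 256)   # (batch, sequence length, embed_size)
    print(block(x).shape)         # torch.Size([2, 10, 256])
```

Because each block maps a sequence of embeddings to a sequence of the same shape, the encoder and decoder in the video can simply repeat such blocks N times.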


Taught by

Aladdin Persson

Related Courses

- Natural Language Generation in Python (DataCamp)
- Machine Translation with Keras (DataCamp)
- Pytorch Seq2Seq Tutorial for Machine Translation (Aladdin Persson via YouTube)
- Region Mutual Information Loss for Semantic Segmentation (University of Central Florida via YouTube)
- Action Recognition, Temporal Localization and Detection in Videos (University of Central Florida via YouTube)