Transfer Learning and Transformers - Full Stack Deep Learning - Spring 2021

Offered By: The Full Stack via YouTube

Tags

Deep Learning Courses Computer Vision Courses Transfer Learning Courses Transformers Courses Attention Mechanisms Courses Embeddings Courses Language Models Courses Positional Encoding Courses

Course Description

Overview

Explore the evolution of transfer learning in this 49-minute lecture from the Full Stack Deep Learning Spring 2021 series. Dive into the origins of transfer learning in computer vision and its application to natural language processing through embeddings. Discover NLP's breakthrough moment with ELMo and ULMFiT, and their impact on benchmarks such as SQuAD, SNLI, and GLUE. Delve into the rise of Transformers, understanding key concepts such as masked self-attention, positional encoding, and layer normalization. Examine Transformer variants including BERT, the GPT series, DistilBERT, and T5. Witness impressive GPT-3 demonstrations and gain insights into future directions in the field of transfer learning and Transformers.

Syllabus

- Introduction
- Transfer Learning in Computer Vision
- Embeddings and Language Models
- NLP's ImageNet moment: ELMo and ULMFiT on datasets like SQuAD, SNLI, and GLUE
- Rise of Transformers
- Attention in Detail: Masked Self-Attention, Positional Encoding, and Layer Normalization
- Transformer Variants: BERT, GPT/GPT-2/GPT-3, DistilBERT, T5, etc.
- GPT-3 Demos
- Future Directions
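
The attention topics in the syllabus above can be sketched in a few lines of NumPy. This is a minimal, illustrative single-head version of masked (causal) self-attention with sinusoidal positional encoding, not code from the lecture itself; the identity Q/K/V projections and the shapes used are simplifying assumptions for brevity.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding from "Attention Is All You Need".
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

def masked_self_attention(x):
    # Single-head self-attention with a causal mask so each position
    # attends only to itself and earlier positions.
    seq_len, d = x.shape
    q, k, v = x, x, x                       # identity projections for brevity
    scores = q @ k.T / np.sqrt(d)           # scaled dot-product scores
    mask = np.triu(np.ones((seq_len, seq_len)), k=1).astype(bool)
    scores[mask] = -np.inf                  # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v

x = np.random.randn(5, 8) + positional_encoding(5, 8)
out = masked_self_attention(x)
print(out.shape)  # (5, 8)
```

In a full Transformer block, the attention output would then pass through layer normalization and a feed-forward sublayer, with learned projection matrices replacing the identity Q/K/V used here.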


Taught by

The Full Stack

Related Courses

NeRF - Representing Scenes as Neural Radiance Fields for View Synthesis
Yannic Kilcher via YouTube
Perceiver - General Perception with Iterative Attention
Yannic Kilcher via YouTube
LambdaNetworks - Modeling Long-Range Interactions Without Attention
Yannic Kilcher via YouTube
Attention Is All You Need - Transformer Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
NeRFs - Neural Radiance Fields - Paper Explained
Aladdin Persson via YouTube