YoVDO

Transfer Learning and Transformers - Full Stack Deep Learning - Spring 2021

Offered By: The Full Stack via YouTube

Tags

Deep Learning Courses Computer Vision Courses Transfer Learning Courses Transformers Courses Attention Mechanisms Courses Embeddings Courses Language Models Courses Positional Encoding Courses

Course Description

Overview

Explore the evolution of transfer learning in this 49-minute lecture from the Full Stack Deep Learning Spring 2021 series. Dive into the origins of transfer learning in computer vision and its application to natural language processing through embeddings. Discover NLP's ImageNet moment with ELMo and ULMFiT, and their impact on benchmarks such as SQuAD, SNLI, and GLUE. Delve into the rise of Transformers, covering key concepts such as masked self-attention, positional encoding, and layer normalization. Examine Transformer variants including BERT, the GPT series, DistilBERT, and T5. Witness impressive GPT-3 demonstrations and gain insights into future directions in transfer learning and Transformers.

Syllabus

- Introduction
- Transfer Learning in Computer Vision
- Embeddings and Language Models
- NLP's ImageNet moment: ELMo and ULMFiT on datasets like SQuAD, SNLI, and GLUE
- Rise of Transformers
- Attention in Detail: Masked Self-Attention, Positional Encoding, and Layer Normalization
- Transformer Variants: BERT, GPT/GPT-2/GPT-3, DistilBERT, T5, etc.
- GPT-3 Demos
- Future Directions


Taught by

The Full Stack

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Machine Learning Techniques (機器學習技法)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Applied Problems of Data Analysis (Прикладные задачи анализа данных)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX