Understanding BigBird - Transformers for Longer Sequences

Offered By: Abhishek Thakur via YouTube

Tags

Natural Language Processing (NLP) Courses
Machine Learning Courses
Transformer Models Courses

Course Description

Overview

Explore the intricacies of BigBird, a transformer model designed for longer sequences, in this informative 27-minute video lecture presented by Vasudev Gupta. Delve into why models like BigBird and Longformer are needed, and understand their advantages over standard BERT and RoBERTa models on tasks involving long text. Learn about BigBird's implementation, compare its linear-scaling block sparse attention to BERT's quadratic full attention, and discover when to opt for each model type. Gain practical insights on getting the best performance from BigBird with Hugging Face, and learn when it outperforms Longformer. The lecture covers block sparse attention, token handling, code implementation, complexity analysis, training techniques, limitations, and support for BigBird variants, including BigBird Pegasus.
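
For readers who want to experiment after watching, the snippet below is a minimal sketch of loading BigBird in block sparse attention mode with the Hugging Face transformers library. It is not code from the lecture; the checkpoint name google/bigbird-roberta-base and the block_size / num_random_blocks values are illustrative choices.

from transformers import AutoTokenizer, BigBirdModel

# Load a pretrained BigBird checkpoint (illustrative choice of model ID).
tokenizer = AutoTokenizer.from_pretrained("google/bigbird-roberta-base")
model = BigBirdModel.from_pretrained(
    "google/bigbird-roberta-base",
    attention_type="block_sparse",  # sparse attention for long inputs; "original_full" behaves like BERT
    block_size=64,                  # tokens per attention block
    num_random_blocks=3,            # random blocks each query block attends to
)

# Encode a long document (up to 4096 tokens for this checkpoint) and run a forward pass.
text = "A long document ... " * 500
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)

Note that if the input sequence is short relative to the block size, the library automatically falls back to full attention, so block sparse attention only pays off on genuinely long inputs.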

Syllabus

Introduction
BigBird Model
Tokens
Code
Complexity
BigBird
Training BigBird
BigBird Limitations
BigBird Support
BigBird Pegasus
Questions


Taught by

Abhishek Thakur

Related Courses

Sequence Models
DeepLearning.AI via Coursera
Modern Natural Language Processing in Python
Udemy
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3
Stanford University via YouTube
Long Form Question Answering in Haystack
James Briggs via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube