Advanced NLP 2022: Modeling Long Sequences

Offered By: Graham Neubig via YouTube

Tags

Natural Language Processing (NLP) Courses
Feature Extraction Courses

Course Description

Overview

Explore advanced techniques for modeling long sequences in natural language processing in this lecture from CMU's Advanced NLP course. Delve into extracting features from long texts and tackling document-level processing tasks. Learn about transformer architectures designed for long contexts, including Transformer-XL, Compressive Transformers, and Sparse Transformers. Examine adaptive-span and sparse-span approaches, as well as the Reformer model. Investigate low-rank approximation and sparse attention methods. Gain insights into evaluation techniques and other relevant methodologies. Conclude with an overview of coreference models, including mention pair models and their components.
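The lecture itself covers these methods in depth; as a rough, illustrative sketch of the sparse attention idea mentioned above, the snippet below restricts each position to a fixed local window of keys (a banded sparsity pattern in the spirit of Sparse Transformers). The function name, window size, and toy dimensions are assumptions chosen for illustration, not details taken from the course.

    import numpy as np

    def local_attention(q, k, v, window=4):
        # Each position attends only to keys within `window`
        # positions on either side: a banded sparsity pattern.
        n, d = q.shape
        scores = q @ k.T / np.sqrt(d)            # (n, n) scaled dot products
        idx = np.arange(n)
        mask = np.abs(idx[:, None] - idx[None, :]) > window
        scores[mask] = -np.inf                   # block out-of-window pairs
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ v                       # (n, d) attended outputs

    # Toy usage: 16 positions, 8-dimensional vectors, self-attention.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((16, 8))
    print(local_attention(x, x, x, window=4).shape)  # (16, 8)

Restricting attention to a local band cuts the cost per layer from quadratic to roughly linear in sequence length, which is the common thread behind the long-sequence architectures listed in the syllabus.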

Syllabus

Introduction
NLP Tasks
Modeling Long Sequences
Separate Encoding
Self-Attention Transformers
Transformer-XL
Compressive Transformers
Sparse Transformers
Adaptive Span Transformers
Sparse Span Transformers
Reformer Model
Low-Rank Approximation
Sparse Attention
Evaluation
Other Methods
Questions
Components of Coreference Models
Mention Pair Models
Model


Taught by

Graham Neubig

Related Courses

Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Einführung in Computer Vision (Introduction to Computer Vision)
Technische Universität München (Technical University of Munich) via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning for Musicians and Artists
Goldsmiths University of London via Kadenze
Прикладные задачи анализа данных (Applied Problems of Data Analysis)
Moscow Institute of Physics and Technology via Coursera