CMU Advanced NLP: Document-Level Modeling
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore advanced natural language processing techniques for document-level modeling in this comprehensive lecture from CMU's Advanced NLP course. Delve into extracting features from long sequences, coreference resolution, and discourse parsing. Learn about various encoding methods, including self-attention and Transformer models, as well as efficient approaches like sparse and adaptive span transformers. Discover techniques for entity coreference, including mention detection and pair models. Examine neural models for discourse parsing and gain insights into evaluating document-level language models. Access additional resources and materials through the provided class website to further enhance your understanding of these advanced NLP concepts.
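For readers unfamiliar with the encoding methods the lecture surveys, below is a minimal sketch of single-head scaled dot-product self-attention, whose quadratic cost over long documents motivates the sparse, adaptive-span, and low-rank variants listed in the syllabus. The NumPy implementation, function name, and toy dimensions are illustrative assumptions, not material from the lecture itself.

import numpy as np

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model) token vectors; w_*: (d_model, d_k) projections (hypothetical shapes).
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Pairwise scores cost O(seq_len^2) time and memory, the bottleneck that
    # sparse, adaptive-span, and low-rank transformers aim to reduce.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ v                               # contextualized token vectors

# Toy usage: 6 tokens with 8-dimensional embeddings (illustrative sizes).
rng = np.random.default_rng(0)
x = rng.normal(size=(6, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)        # (6, 8)

The efficient approaches covered in the lecture generally restrict or approximate the full scores matrix rather than computing every pairwise interaction.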
Syllabus
Intro
Document-Level Language Modeling
Recurrent Neural Networks
Encoding Methods
Self-Attention
Transformer-XL
Compressive Transformer
Sparse Transformer
Adaptive Span Transformer
Sparse Computations
Reformer
Low Rank Approximation
Evaluation
Entity Coreference
Mention Detection
Components
Instances
Pair Models
Coreference
Coreference Model
Coreference Models
Discourse Parsing
Neural Models
Taught by
Graham Neubig
Related Courses
Sequence Models (DeepLearning.AI via Coursera)
Modern Natural Language Processing in Python (Udemy)
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 (Stanford University via YouTube)
Long Form Question Answering in Haystack (James Briggs via YouTube)
Spotify's Podcast Search Explained (James Briggs via YouTube)