Neural Nets for NLP 2021 - Document-Level Models
Offered By: Graham Neubig via YouTube
Syllabus
Some NLP Tasks we've Handled
Some Connections to Tasks over Documents
Document Level Language Modeling
Remember: Modeling using Recurrent Networks
Simple: Infinitely Pass State
Separate Encoding for Coarse-grained Document Context
Self-attention/Transformers Across Sentences
Transformer-XL: Truncated BPTT + Transformer
Adaptive Span Transformers
Reformer: Efficient Adaptively Sparse Attention
How to Evaluate Document-Level Models?
Document Problems: Entity Coreference
Mention (Noun Phrase) Detection
Components of a Coreference Model
Coreference Models: Instances
Mention Pair Models
Entity Models: Entity-Mention Models
Advantages of Neural Network Models for Coreference
End-to-End Neural Coreference (Span Model)
End-to-End Neural Coreference (Coreference Model)
Using Coreference in Neural Models
Discourse Parsing w/ Attention-based Hierarchical Neural Networks
Uses of Discourse Structure in Neural Models
Taught by
Graham Neubig
Related Courses
Neural Networks for Machine Learning (University of Toronto via Coursera)
Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)
Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)