Neural Nets for NLP 2021 - Document-Level Models
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Some NLP Tasks we've Handled
Some Connections to Tasks over Documents
Document Level Language Modeling
Remember: Modeling using Recurrent Networks
Simple: Infinitely Pass State
Separate Encoding for Coarse-grained Document Context
Self-attention/Transformers Across Sentences
Transformer-XL: Truncated BPTT + Transformer
Adaptive Span Transformers
Reformer: Efficient Adaptively Sparse Attention
How to Evaluate Document-level Models?
Document Problems: Entity Coreference
Mention (Noun Phrase) Detection
Components of a Coreference Model
Coreference Models: Instances
Mention Pair Models
Entity Models: Entity-Mention Models
Advantages of Neural Network Models for Coreference
End-to-End Neural Coreference (Span Model)
End-to-End Neural Coreference (Coreference Model)
Using Coreference in Neural Models
Discourse Parsing w/ Attention-based Hierarchical Neural Networks
Uses of Discourse Structure in Neural Models
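The mention-pair portion of the syllabus covers scoring each mention against candidate antecedents and linking it to the best-scoring one. A minimal plain-Python sketch of that linking step, using a hypothetical token-overlap score in place of the lecture's learned neural scorer (the `overlap_score` function and `threshold` value are illustrative assumptions, not from the course):

```python
def overlap_score(m1, m2):
    """Toy antecedent score: Jaccard overlap of lowercase tokens.
    A real mention-pair model would use a learned neural scorer here."""
    t1, t2 = set(m1.lower().split()), set(m2.lower().split())
    return len(t1 & t2) / max(len(t1 | t2), 1)

def resolve(mentions, threshold=0.3):
    """Link each mention to its highest-scoring earlier mention,
    or to no antecedent (None) if no score clears the threshold."""
    links = []  # links[i] = index of chosen antecedent, or None
    for i, mention in enumerate(mentions):
        best, best_score = None, threshold
        for j in range(i):
            score = overlap_score(mention, mentions[j])
            if score > best_score:
                best, best_score = j, score
        links.append(best)
    return links

mentions = ["Barack Obama", "the president", "Obama", "Michelle Obama"]
print(resolve(mentions))  # → [None, None, 0, 2]
```

Note the two characteristic failure modes of surface overlap: "the president" is never linked to "Barack Obama", while "Michelle Obama" is wrongly linked to "Obama". These are exactly the weaknesses that motivate the "Advantages of Neural Network Models for Coreference" unit, where learned representations capture semantic compatibility beyond string match.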
Taught by
Graham Neubig