CMU Neural Nets for NLP 2018 - Document-Level Models
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore document-level models in natural language processing through this comprehensive lecture from Carnegie Mellon University's Neural Networks for NLP course. Delve into topics such as language modeling, long-term dependencies, topic modeling, and coreference resolution. Learn about entity mention models, entity-centric models, and complex features in coreference. Examine discourse parsing techniques, including shift-reduce parsers and recursive models. Understand the importance of coreference in language modeling and its applications in discourse analysis. Investigate document classification methods and their accuracy. Gain insights into advanced NLP concepts and techniques for processing and analyzing entire documents.
Syllabus
Document-level Models
Recap
Tasks over documents
Language modeling
Long-term dependencies
Topic modeling
Evaluation
Coreference
Mention Detection
Model Components
Entity Mention Models
Entity-Centric Models
Complex Features
Advantages
Coreference Resolution
Questions
Cluster-level features
Model overview
Inference model
Why do I need coreference?
Language modeling with coreference
Discourse parsing
Course parsing
Shift-reduce parser
Discrete features
Recursive models
Complex models
Discourse relations
Discourse parse
Discourse dependency structure
Document classification
Document classification accuracy
Taught by
Graham Neubig
Related Courses
Artificial Intelligence in Social Media Analytics — Johns Hopkins University via Coursera
Introduction to Natural Language Processing in R — DataCamp
Introduction to Text Analysis in R — DataCamp
Topic Modeling in R — DataCamp
CCAI Insights — Google via Google Cloud Skills Boost