Neural Nets for NLP 2017 - Neural Semantic Parsing
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore neural semantic parsing in this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into graph-based parsing, minimum spanning tree parsing, and structured training techniques. Learn about dynamic programming methods for phrase structure parsing and reranking. Access accompanying slides and code examples to reinforce understanding of key concepts. Gain insights into semantic parsing approaches, including sequence-to-sequence models and tree-based parsing models. Examine various meaning representations such as first-order logic and Abstract Meaning Representation. Investigate syntax-driven semantic parsing, CCG parsing, and semantic role labeling with neural models.
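To make the task concrete before the syllabus: semantic parsing maps a natural-language utterance to a machine-readable meaning representation. The tiny sketch below illustrates only the input/output contract with a hand-written lookup table; the utterances and logical forms are invented for illustration, and the lecture's actual models learn this mapping with neural networks rather than patterns.

```python
# Toy illustration of the semantic parsing task: utterance -> logical form.
# The patterns and logical forms here are invented examples, standing in
# for what a learned sequence-to-sequence or tree-based model would produce.

PATTERNS = {
    "what states border texas": "answer(state(borders(texas)))",
    "what is the capital of ohio": "answer(capital(ohio))",
}

def parse(utterance: str) -> str:
    """Look up a logical form for an utterance (stand-in for a learned model)."""
    key = utterance.lower().strip().rstrip("?")
    return PATTERNS.get(key, "unknown")

print(parse("What states border Texas?"))  # answer(state(borders(texas)))
```

A neural parser replaces the lookup table with an encoder over the utterance and a decoder that emits the logical form token by token (or node by node, for tree-based models).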
Syllabus
Intro
Tree Structures of Syntax
Representations of Semantics
Meaning Representations
Example Special-purpose Representations
Example Query Tasks
A First Attempt: Sequence-to-Sequence Models (Jia and Liang 2016)
A Better Attempt: Tree-Based Parsing Models, Generating Top-Down with a Hierarchical Sequence-to-Sequence Model (Dong and Lapata 2016)
Meaning Representation Desiderata (Jurafsky and Martin 17.1)
First-order Logic
Abstract Meaning Representation (Banarescu et al. 2013)
Syntax-driven Semantic Parsing
CCG and CCG Parsing
Parsing to Graph Structures
Semantic Role Labeling (Gildea and Jurafsky 2002)
Neural Models for Semantic Role Labeling
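The "First-order Logic" syllabus item above treats meaning representations as formulas that can be evaluated against a world model. As a minimal sketch of that idea, the snippet below evaluates an existential formula over a toy domain; the predicates, entities, and world are invented for illustration and are not the lecture's examples.

```python
# Minimal sketch of evaluating a first-order-logic meaning representation
# against a toy world model (invented entities and facts, for illustration).

WORLD = {
    "state": {"texas", "oklahoma", "louisiana"},
    "borders": {("oklahoma", "texas"), ("louisiana", "texas")},
}

def borders(x: str, y: str) -> bool:
    """Predicate borders(x, y) checked against the toy world."""
    return (x, y) in WORLD["borders"]

def exists(pred) -> bool:
    """Existential quantification over the toy domain of states: ∃x. pred(x)."""
    return any(pred(x) for x in WORLD["state"])

# ∃x. borders(x, texas) -- "some state borders Texas"
print(exists(lambda x: borders(x, "texas")))  # True
```

A semantic parser's job is to produce formulas like `exists(lambda x: borders(x, "texas"))` from utterances such as "Does any state border Texas?"; evaluation against a database or world model then yields the answer.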
Taught by
Graham Neubig
Related Courses
Algorithms: Design and Analysis, Part 2 - Stanford University via Coursera
Discrete Optimization - University of Melbourne via Coursera
Conception et mise en œuvre d'algorithmes (Algorithm Design and Implementation) - École Polytechnique via Coursera
Computability, Complexity & Algorithms - Georgia Institute of Technology via Udacity
Discrete Inference and Learning in Artificial Vision - École Centrale Paris via Coursera