Neural Nets for NLP 2018 - Neural Semantic Parsing
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Tree Structures of Syntax
Representations of Semantics
Meaning Representations
Example Special-purpose Representations
Example Query Tasks
Example Command and Control Tasks
Example Code Generation Tasks
A Better Attempt: Tree-based Parsing Models • Generate top-down using a hierarchical sequence-to-sequence model (Dong and Lapata 2016)
Code Generation: Handling Syntax • Code also has syntax, e.g. in the form of Abstract Syntax Trees
Problem w/ Weakly Supervised Learning: Spurious Logical Forms • Sometimes you can get the right answer without actually doing the generalizable thing (Guu et al. 2017)
Meaning Representation Desiderata (Jurafsky and Martin 17.1)
First-order Logic
Abstract Meaning Representation (Banarescu et al. 2013)
Other Formalisms
Parsing to Graph Structures
Linearization for Graph Structures (Konstas et al. 2017)
CCG and CCG Parsing
Neural Module Networks: Soft Syntax-driven Semantics (Andreas et al. 2016) • Standard syntax-semantics interfaces use symbolic representations • It is also possible to use syntax to guide the structure of neural networks that learn semantics
Neural Models for Semantic Role Labeling • A simple model w/ a deep highway LSTM tagger works well (He et al. 2017)
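To make the "Handling Syntax" topic above concrete: the Abstract Syntax Trees that code-generation models predict can be inspected directly with Python's built-in `ast` module. This is an illustrative sketch, not material from the lecture itself.

```python
import ast

# Parse a one-line program into its Abstract Syntax Tree.
# Syntax-aware code-generation models predict trees like this
# rather than raw token sequences.
tree = ast.parse("x = a + b * 2")

# ast.dump renders the hierarchical structure the model must produce:
# an Assign node whose value is a nested BinOp (Add over Mult).
print(ast.dump(tree.body[0]))
```

Note how operator precedence is already resolved in the tree (`b * 2` is a subtree of the addition), which is exactly the kind of structural constraint a tree-based decoder can exploit.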
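The "Linearization for Graph Structures" topic (Konstas et al. 2017) boils down to turning a rooted graph into a bracketed token sequence so an ordinary sequence-to-sequence model can consume or produce it. The mini-graph and depth-first linearizer below are a hypothetical sketch of that idea; the graph format is an assumption for illustration only.

```python
# Toy AMR-like graph: node -> list of (role, child) edges.
# "The boy wants to go": want-01 has the boy as agent and go-01
# as theme; go-01 re-enters the shared "boy" node.
graph = {
    "want-01": [(":ARG0", "boy"), (":ARG1", "go-01")],
    "go-01": [(":ARG0", "boy")],
    "boy": [],
}

def linearize(node, graph, visited=None):
    """Depth-first linearization; re-entrant nodes are emitted by name only."""
    if visited is None:
        visited = set()
    if node in visited:
        return [node]  # re-entrancy: repeat the variable, no new subtree
    visited.add(node)
    tokens = ["(", node]
    for role, child in graph[node]:
        tokens += [role] + linearize(child, graph, visited)
    tokens.append(")")
    return tokens

print(" ".join(linearize("want-01", graph)))
# → ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 boy ) )
```

The design choice to repeat re-entrant variables instead of expanding them again is what lets a flat sequence still encode a graph rather than just a tree.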
Taught by
Graham Neubig
Related Courses
Natural Language Processing (Columbia University via Coursera)
Natural Language Processing (Stanford University via Coursera)
Introduction to Natural Language Processing (University of Michigan via Coursera)
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) (Universidad de Alicante via Miríadax)
Natural Language Processing (Indian Institute of Technology, Kharagpur via Swayam)