Neural Nets for NLP 2018 - Neural Semantic Parsing
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Tree Structures of Syntax
Representations of Semantics
Meaning Representations
Example Special-purpose Representations
Example Query Tasks
Example Command and Control Tasks
Example Code Generation Tasks
A Better Attempt: Tree-based Parsing Models • Generate top-down using a hierarchical sequence-to-sequence model (Dong and Lapata 2016) (see the first sketch after the syllabus)
Code Generation: Handling Syntax • Code also has syntax, e.g. in the form of Abstract Syntax Trees (see the AST sketch after the syllabus)
Problem with Weakly Supervised Learning: Spurious Logical Forms • Sometimes you can get the right answer without actually doing the generalizable thing (Guu et al. 2017) (see the toy example after the syllabus)
Meaning Representation Desiderata (Jurafsky and Martin 17.1)
First-order Logic
Abstract Meaning Representation (Banarescu et al. 2013)
Other Formalisms
Parsing to Graph Structures
Linearization for Graph Structures (Konstas et al. 2017)
CCG and CCG Parsing
Neural Module Networks: Soft Syntax-driven Semantics (Andreas et al. 2016) • Standard syntax-semantics interfaces use symbolic representations • It is also possible to use syntax to guide the structure of neural networks that learn semantics (see the module sketch after the syllabus)
Neural Models for Semantic Role Labeling • A simple model with a deep highway LSTM tagger works well (He et al. 2017) (see the tagger sketch after the syllabus)
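
The following sketches illustrate a few of the syllabus topics above. They are rough, hypothetical examples written for this listing, not code from the lecture; all names, data, and hyperparameters in them are invented.

For the tree-based parsing item, here is a minimal sketch of top-down tree decoding in the spirit of Dong and Lapata (2016): the decoder emits a placeholder nonterminal token and recursively expands it. The decoder below is a stub returning canned tokens (the "<root>" and "<n>" symbols and the toy logical form are assumptions); a real model would run an LSTM decoder conditioned on the encoded question and the parent decoder state at each call.

    NONTERMINAL = "<n>"   # placeholder token that triggers a recursive expansion

    def decode_sequence(parent_symbol, depth):
        # Stand-in for one sequence decoding step; returns child tokens.
        # Canned outputs imitating the logical form answer(largest(state(all)))
        # for a question like "what is the largest state" (invented example).
        canned = {
            "<root>":  ["answer", NONTERMINAL],
            "answer":  ["largest", NONTERMINAL],
            "largest": ["state", "all"],
        }
        return canned.get(parent_symbol, ["all"]) if depth < 5 else ["all"]

    def decode_tree(parent_symbol="<root>", depth=0):
        # Expand nonterminals top-down, producing a nested (tree) structure.
        tokens = decode_sequence(parent_symbol, depth)
        tree = []
        for i, tok in enumerate(tokens):
            if tok == NONTERMINAL:
                # Recurse on the subtree, crudely conditioning on the previous
                # token as the parent symbol for the next decoding call.
                tree.append(decode_tree(tokens[i - 1], depth + 1))
            else:
                tree.append(tok)
        return tree

    print(decode_tree())   # ['answer', ['largest', ['state', 'all']]]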
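For the code generation item, code does indeed come with syntax: a short demonstration using Python's built-in ast module to parse a snippet into an Abstract Syntax Tree, the kind of structure a syntax-aware generation model predicts node by node instead of a flat token string. The snippet itself is arbitrary.

    import ast

    snippet = "sorted(xs, key=lambda x: -x)[0]"
    tree = ast.parse(snippet, mode="eval")

    # The nesting (Subscript -> Call -> ... -> Lambda) is the structure a
    # syntax-aware code generation model would predict step by step.
    print(ast.dump(tree, indent=2))   # the indent= argument needs Python 3.9+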
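For the spurious logical forms item, a toy worked example (the table, question, and both programs are invented): two programs earn the same reward on the training example, but only one encodes the intended meaning and survives a change to the data.

    # Question: "Which city has the largest population?"  Gold answer: "Tokyo"
    table = [
        {"city": "Tokyo", "population": 37},
        {"city": "Delhi", "population": 29},
        {"city": "Paris", "population": 11},
    ]

    def intended(rows):     # the generalizable program
        return max(rows, key=lambda r: r["population"])["city"]

    def spurious(rows):     # "take the first row" happens to be right here
        return rows[0]["city"]

    print(intended(table), spurious(table))              # Tokyo Tokyo -> same reward
    # Re-order the table and only the intended program stays correct:
    print(intended(table[::-1]), spurious(table[::-1]))  # Tokyo Paris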
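For the neural module networks item, a toy sketch of the core idea from Andreas et al. (2016): a parse of the question selects which small modules are composed. The "modules" below are hand-written numpy functions over an invented set of annotated image regions; in the real model they are small learned networks over image features.

    import numpy as np

    # Toy "image": four regions, each annotated with the concepts it contains.
    regions = [{"dog", "black"}, {"cat", "white"}, {"grass"}, {"sky"}]

    def find(concept):
        # Toy find[] module: attention over regions containing the concept.
        return np.array([1.0 if concept in r else 0.0 for r in regions])

    def intersect(att_a, att_b):
        # Composition module, e.g. for adjectival modification ("black dog").
        return att_a * att_b

    def exists(att):
        # Measurement module: is any region attended to?
        return bool(att.max() > 0.5)

    # The parse of "is there a black dog?" determines how modules are wired:
    print(exists(intersect(find("black"), find("dog"))))   # True
    print(exists(intersect(find("black"), find("cat"))))   # False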
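For the semantic role labeling item, a rough PyTorch sketch of an LSTM BIO tagger, not He et al.'s exact architecture (which adds highway connections and constrained decoding): embed each word together with a predicate indicator, run a stacked bidirectional LSTM, and score a role tag per token. The hyperparameters, tiny vocabulary, and toy sentence are invented.

    import torch
    import torch.nn as nn

    class SRLTagger(nn.Module):
        def __init__(self, vocab_size, n_tags, emb_dim=64, hidden=128, layers=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.pred_embed = nn.Embedding(2, emb_dim)   # is this token the predicate?
            self.lstm = nn.LSTM(emb_dim * 2, hidden, num_layers=layers,
                                bidirectional=True, batch_first=True)
            self.out = nn.Linear(hidden * 2, n_tags)     # per-token BIO tag scores

        def forward(self, tokens, predicate_mask):
            x = torch.cat([self.embed(tokens), self.pred_embed(predicate_mask)], dim=-1)
            h, _ = self.lstm(x)
            return self.out(h)                           # (batch, seq_len, n_tags)

    # Toy usage: "the cat chased the dog", predicate = "chased" (position 2).
    tokens = torch.tensor([[1, 2, 3, 1, 4]])
    predicate_mask = torch.tensor([[0, 0, 1, 0, 0]])
    tagger = SRLTagger(vocab_size=10, n_tags=5)
    scores = tagger(tokens, predicate_mask)
    print(scores.shape)        # torch.Size([1, 5, 5])
    print(scores.argmax(-1))   # predicted BIO tag ids (untrained, so arbitrary)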
Taught by
Graham Neubig
Related Courses
Language, Proof and Logic (Stanford University via edX)
Artificial Intelligence: Knowledge Representation And Reasoning (Indian Institute of Technology Madras via Swayam)
AI: Knowledge Representation and Reasoning (Indian Institute of Technology Madras via Swayam)
人工智慧:搜尋方法與邏輯推論 (Artificial Intelligence - Search & Logic) (National Taiwan University via Coursera)
Semantics of First-Order Logic (Stanford University via edX)