Neural Nets for NLP 2017 - Neural Semantic Parsing

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses
Dynamic Programming Courses

Course Description

Overview

Explore neural semantic parsing in this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into graph-based parsing, minimum spanning tree parsing, and structured training techniques. Learn about dynamic programming methods for phrase structure parsing and reranking. Access accompanying slides and code examples to reinforce understanding of key concepts. Gain insights into semantic parsing approaches, including sequence-to-sequence models and tree-based parsing models. Examine various meaning representations such as first-order logic and Abstract Meaning Representation. Investigate syntax-driven semantic parsing, CCG parsing, and semantic role labeling with neural models.
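To make the task concrete: semantic parsing maps a natural-language utterance to a formal meaning representation, such as a first-order-logic expression. The neural models covered in the lecture learn this mapping from data; the sketch below substitutes a hand-written pattern for the learned model, and its flight-query grammar and predicate names (`flight`, `from`, `to`) are hypothetical, chosen purely for illustration.

```python
import re

def parse_query(utterance: str) -> str:
    """Map a flight query to a first-order-logic-style meaning
    representation. A toy stand-in for a learned semantic parser:
    the pattern and predicate vocabulary are illustrative only."""
    m = re.match(r"flights from (\w+) to (\w+)", utterance.lower())
    if m:
        src, dst = m.groups()
        # Lambda-calculus-style logical form, as in classic query tasks
        return f"lambda x. flight(x) & from(x, {src}) & to(x, {dst})"
    raise ValueError("utterance not covered by this toy grammar")

print(parse_query("flights from Boston to Denver"))
```

A neural sequence-to-sequence parser replaces the regular expression with an encoder-decoder that emits the logical form token by token, which is what makes coverage beyond a fixed pattern set possible.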

Syllabus

Intro
Tree Structures of Syntax
Representations of Semantics
Meaning Representations
Example Special-purpose Representations
Example Query Tasks
A First Attempt: Sequence-to-Sequence Models (Jia and Liang 2016)
A Better Attempt: Tree-Based Parsing Models, generating top-down with a hierarchical sequence-to-sequence model (Dong and Lapata 2016)
Meaning Representation Desiderata (Jurafsky and Martin 17.1)
First-order Logic
Abstract Meaning Representation (Banarescu et al. 2013)
Syntax-driven Semantic Parsing
CCG and CCG Parsing
Parsing to Graph Structures
Semantic Role Labeling (Gildea and Jurafsky 2002)
Neural Models for Semantic Role Labeling


Taught by

Graham Neubig

Related Courses

Natural Language Processing
Columbia University via Coursera
Natural Language Processing
Stanford University via Coursera
Introduction to Natural Language Processing
University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies)
Universidad de Alicante via Miríadax
Natural Language Processing
Indian Institute of Technology, Kharagpur via Swayam