Neural Nets for NLP - Transition-based Parsing

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses

Course Description

Overview

Learn about transition-based parsing in natural language processing through this lecture from CMU's Neural Networks for NLP course. Explore the fundamentals of shift-reduce parsing with feed-forward networks, stack LSTMs, and transition-based models for phrase structure. Discover linearized trees as a simple alternative approach. Gain insight into parsing techniques including arc-standard shift-reduce parsing, recursive neural networks, and methods for encoding parsing configurations. Understand why tree structure matters in NLP and how classification decisions are made in parsing tasks. Finally, delve into advanced topics such as non-linear functions and alternative transition methods.
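
As background for the shift-reduce material, here is a minimal Python sketch of the arc-standard transition system. The example sentence, the action names, and the hand-picked transition sequence are illustrative assumptions, not taken from the lecture slides.

# Minimal sketch of arc-standard shift-reduce dependency parsing.
# The sentence and transition sequence below are assumed examples.

SHIFT, REDUCE_L, REDUCE_R = "shift", "reduce_left", "reduce_right"

def parse(words, transitions):
    """Apply a sequence of arc-standard transitions.

    Returns a list of (head, dependent) arcs over word indices.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    for action in transitions:
        if action == SHIFT:            # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif action == REDUCE_L:       # top of stack becomes head of the word below it
            dep, head = stack.pop(-2), stack[-1]
            arcs.append((head, dep))
        elif action == REDUCE_R:       # word below the top becomes head of the top
            dep, head = stack.pop(), stack[-1]
            arcs.append((head, dep))
    return arcs

words = ["I", "saw", "a", "girl"]
# One valid transition sequence for this sentence (assumed for illustration):
seq = [SHIFT, SHIFT, REDUCE_L,         # "saw" heads "I"
       SHIFT, SHIFT, REDUCE_L,         # "girl" heads "a"
       REDUCE_R]                       # "saw" heads "girl"
print(parse(words, seq))               # [(1, 0), (3, 2), (1, 3)]

In the neural variants the lecture covers, the hard-coded transition sequence is replaced by a classifier (a feed-forward network or stack LSTM) that scores the next action from the current stack and buffer configuration.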

Syllabus

Intro
Two Types of Linguistic Structure
Why Dependencies?
Arc Standard Shift-Reduce Parsing (Yamada & Matsumoto 2003, Nivre 2003)
Shift-Reduce Example
Classification for Shift-reduce
Making Classification Decisions
Non-linear Function: Cube Function
Why Tree Structure?
Recursive Neural Networks (Socher et al. 2011)
Encoding Parsing Configurations w/ RNNs
Alternative Transition Methods
Shift-reduce Parsing for Phrase Structure (Sagae and Lavie 2006, Watanabe 2015): Shift, Reduce-X (binary), Unary-X (unary), where X is a label
A Simple Approximation: Linearized Trees (Vinyals et al. 2015)
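
To make the final syllabus item concrete: in the linearized-tree approach of Vinyals et al. (2015), constituency parsing is cast as sequence prediction by writing the phrase-structure tree as a flat bracket string that a sequence-to-sequence model can emit token by token. Below is a minimal sketch; the toy tree and the exact token format are assumptions for illustration.

# Minimal sketch of tree linearization in the style of Vinyals et al. (2015).
# The toy (label, children) tree below is an assumed example.

def linearize(tree):
    """Flatten a nested (label, children) tree into bracket tokens."""
    label, children = tree
    tokens = ["(" + label]
    for child in children:
        if isinstance(child, tuple):   # non-terminal: recurse
            tokens.extend(linearize(child))
        else:                          # terminal word
            tokens.append(child)
    tokens.append(")" + label)         # labeled closing bracket
    return tokens

tree = ("S",
        [("NP", ["the", "dog"]),
         ("VP", ["barks"])])
print(" ".join(linearize(tree)))
# (S (NP the dog )NP (VP barks )VP )S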


Taught by

Graham Neubig

Related Courses

Natural Language Processing
Columbia University via Coursera
Natural Language Processing
Stanford University via Coursera
Introduction to Natural Language Processing
University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano
Universidad de Alicante via Miríadax
Natural Language Processing
Indian Institute of Technology, Kharagpur via Swayam