Neural Nets for NLP 2017 - Parsing With Dynamic Programs
Offered By: Graham Neubig via YouTube
Course Description
Overview
Explore parsing with dynamic programs in this lecture from CMU's Neural Networks for NLP course. Delve into graph-based parsing, minimum spanning tree parsing, and structured training techniques. Learn about dynamic programming methods for phrase structure parsing and reranking approaches. Examine algorithms such as Chu-Liu-Edmonds and Eisner's, and understand the transition from traditional to neural models. Investigate global probabilistic training, the CKY and Viterbi algorithms, and Conditional Random Fields (CRFs) for parsing. Gain insights into neural CRFs, structured inference, and recursive neural networks for parsing tasks.
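To give a flavor of the dynamic programs the lecture covers, here is a minimal Viterbi-CKY sketch in Python over a hypothetical toy grammar in Chomsky normal form. The grammar, scores, and function names are invented for illustration and are not taken from the course materials.

    from collections import defaultdict

    # Hypothetical toy grammar: binary rules (A -> B C) and lexical
    # rules (A -> word), each paired with a log-score.
    BINARY_RULES = {
        ("NP", "VP"): [("S", 0.0)],
        ("DT", "NN"): [("NP", -0.5)],
        ("VB", "NP"): [("VP", -0.3)],
    }
    LEXICAL_RULES = {
        "the": [("DT", 0.0)],
        "dog": [("NN", -0.1)],
        "saw": [("VB", -0.2)],
        "cat": [("NN", -0.4)],
    }

    def cky(words):
        """Viterbi CKY: best[(i, j)] maps nonterminal -> best
        log-score over parses of the span words[i:j]."""
        n = len(words)
        best = defaultdict(dict)
        # Width-1 spans come from the lexical rules.
        for i, w in enumerate(words):
            for nt, s in LEXICAL_RULES.get(w, []):
                best[(i, i + 1)][nt] = s
        # Build wider spans bottom-up, trying every split point k.
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                for k in range(i + 1, j):
                    for lhs_l, s_l in best[(i, k)].items():
                        for lhs_r, s_r in best[(k, j)].items():
                            for nt, s in BINARY_RULES.get((lhs_l, lhs_r), []):
                                score = s + s_l + s_r
                                if score > best[(i, j)].get(nt, float("-inf")):
                                    best[(i, j)][nt] = score
        return best[(0, n)].get("S")  # None if no full parse exists

    print(cky(["the", "dog", "saw", "the", "cat"]))  # best log-score for an S

Each chart cell stores the best log-score per nonterminal for a span, and wider spans are combined from adjacent smaller ones, which is the shared dynamic-programming idea behind the CKY and Viterbi algorithms discussed in the lecture.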
Syllabus
Introduction
Linguistic Structure
Dynamic Programming Based Models
Minimum Spanning Tree
Graph Based vs Transition Based
Chu-Liu-Edmonds Algorithm
Eisner's Algorithm
Quiz
Before Neural Nets
Higher Order Dependency Parsing
Neural Models
Motivation
Model
Example
Global Probabilistic Training
Code Example
Algorithms
Phrase Structures
Parsing vs Tagging
Hypergraph Edges
Scoring Edges
CKY Algorithm
Viterbi Algorithm
Over Graphs
CRF
CRF Example
CRF Over Trees
Neural CRF
Inference
Parsing
Structured Inference
Recursive Neural Networks
Reranking
Reranking Results
Next Time
Taught by
Graham Neubig
Related Courses
Algorithms: Design and Analysis, Part 2 - Stanford University via Coursera
Discrete Optimization - University of Melbourne via Coursera
Conception et mise en œuvre d'algorithmes - École Polytechnique via Coursera
Computability, Complexity & Algorithms - Georgia Institute of Technology via Udacity
Discrete Inference and Learning in Artificial Vision - École Centrale Paris via Coursera