YoVDO

Neural Nets for NLP 2020 - Search-Based Structured Prediction

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses
Algorithm Design Courses
Probabilistic Models Courses

Course Description

Overview

Explore search-based structured prediction in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, examining its simplicity in training non-probabilistic global models. Contrast perceptron and global normalization approaches, and investigate structured training techniques. Learn about cost-augmented hinge loss and its application to sequence modeling. Gain insights into addressing exposure bias with simple remedies. Understand the intricacies of structured max-margin objectives and their role in NLP tasks. Discover how corrupt training data impacts model performance and explore strategies to mitigate its effects.

Syllabus

Intro
Types of Prediction
Two Methods for Approximation
Structured Perceptron Loss
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models; find the one-best hypothesis, and if its score is better than the correct answer's, adjust parameters to fix this (a minimal sketch follows the syllabus)
Contrasting Perceptron and Global Normalization (globally normalized probabilistic models)
Structured Training and Pre-training
Cost-augmented Hinge
Costs over Sequences
Corrupt Training Data
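
The syllabus item on the Structured Perceptron Algorithm describes its update rule in one sentence. Below is a minimal, hedged sketch of that update for a toy sequence-labeling task; the function names, hand-built features, and brute-force decoder are illustrative assumptions, not taken from the lecture, which covers more efficient search (e.g., Viterbi or beam search) for finding the one-best hypothesis.

from collections import defaultdict
from itertools import product

def featurize(words, tags):
    """Count emission (word, tag) and transition (prev_tag, tag) features."""
    feats = defaultdict(float)
    prev = "<s>"
    for word, tag in zip(words, tags):
        feats[("emit", word, tag)] += 1.0
        feats[("trans", prev, tag)] += 1.0
        prev = tag
    return feats

def score(weights, feats):
    return sum(weights[f] * v for f, v in feats.items())

def decode(weights, words, tagset):
    """Brute-force argmax over all tag sequences (stand-in for real search)."""
    best, best_score = None, float("-inf")
    for tags in product(tagset, repeat=len(words)):
        s = score(weights, featurize(words, tags))
        if s > best_score:
            best, best_score = list(tags), s
    return best

def perceptron_update(weights, words, gold_tags, tagset):
    """Find the one-best hypothesis; if it is not the correct answer,
    add the gold features and subtract the predicted features."""
    pred_tags = decode(weights, words, tagset)
    if pred_tags != gold_tags:
        for f, v in featurize(words, gold_tags).items():
            weights[f] += v
        for f, v in featurize(words, pred_tags).items():
            weights[f] -= v
    return weights

# Toy usage: one training example, a few passes.
weights = defaultdict(float)
words = ["time", "flies", "fast"]
gold = ["NOUN", "VERB", "ADV"]
tagset = ["NOUN", "VERB", "ADV"]
for _ in range(5):
    perceptron_update(weights, words, gold, tagset)
print(decode(weights, words, tagset))  # recovers the gold tags

Note that this update is non-probabilistic: nothing is normalized, and weights only change when the decoder's one-best disagrees with the gold answer, which is the contrast with globally normalized models drawn in the lecture.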


Taught by

Graham Neubig

Related Courses

Fundamentals of Quantitative Modeling
University of Pennsylvania via Coursera
Probability Theory: The Science of Randomness
Tomsk State University via Stepik
Statistics and Data Science
Massachusetts Institute of Technology via edX
Natural Language Processing with Probabilistic Models
DeepLearning.AI via Coursera
Natural Language Processing
DeepLearning.AI via Coursera