
Neural Nets for NLP 2020 - Search-Based Structured Prediction

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses
Algorithm Design Courses
Probabilistic Models Courses

Course Description

Overview

Explore search-based structured prediction in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, examining its simplicity in training non-probabilistic global models. Contrast perceptron and global normalization approaches, and investigate structured training techniques. Learn about cost-augmented hinge loss and its application to sequence modeling. Gain insights into addressing exposure bias with simple remedies. Understand the intricacies of structured max-margin objectives and their role in NLP tasks. Discover how corrupt training data impacts model performance and explore strategies to mitigate its effects.
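For context on the cost-augmented hinge loss mentioned above, here is a minimal sketch, assuming a Hamming cost over label sequences and a small enumerable candidate set; the function names and data layout are illustrative and not taken from the lecture:

    def hamming_cost(y_hat, y_gold):
        # Number of positions where a candidate sequence disagrees with the gold sequence
        return sum(a != b for a, b in zip(y_hat, y_gold))

    def cost_augmented_hinge(score, candidates, y_gold):
        # loss = max(0, max_y [score(y) + cost(y, y_gold)] - score(y_gold))
        # `score` is an assumed function mapping a label sequence to a model score.
        augmented_best = max(score(y) + hamming_cost(y, y_gold) for y in candidates)
        return max(0.0, augmented_best - score(y_gold))

In practice the inner maximization is not done by enumerating candidates but by a cost-augmented decoding (search) step, which is where the "search-based" part of the lecture comes in.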

Syllabus

Intro
Types of Prediction
Two Methods for Approximation
Structured Perceptron Loss
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models. Find the one-best output, and if its score is better than the correct answer's, adjust the parameters to fix this (see the sketch after the syllabus).
Contrasting Perceptron and Global Normalization (globally normalized probabilistic models)
Structured Training and Pre-training
Cost-augmented Hinge
Costs over Sequences
Corrupt Training Data
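
As a companion to the Structured Perceptron item above, a minimal sketch of one update step, assuming a sparse feature function feats(x, y) and a decode(weights, x) search routine that returns the one-best output; both names are illustrative, not from the lecture:

    def structured_perceptron_update(weights, feats, decode, x, y_gold):
        # Find the current one-best output by search.
        y_pred = decode(weights, x)
        # If the one-best differs from (i.e., outscores) the correct answer,
        # move the weights toward the gold features and away from the predicted ones.
        if y_pred != y_gold:
            for f, v in feats(x, y_gold).items():
                weights[f] = weights.get(f, 0.0) + v
            for f, v in feats(x, y_pred).items():
                weights[f] = weights.get(f, 0.0) - v
        return weights

Repeating this update over the training data yields the simple non-probabilistic global model that the lecture contrasts with globally normalized training.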


Taught by

Graham Neubig

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX