YoVDO

Neural Nets for NLP - Structured Prediction Basics

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Natural Language Processing (NLP) Courses, Model Training Courses, Sequence Labeling Courses

Course Description

Overview

Explore structured prediction basics in this lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, structured max-margin objectives, and simple remedies to exposure bias. Learn about various types of prediction, the importance of modeling output interactions, and training methods for structured models. Examine local normalization, global normalization, and cost-augmented decoding for Hamming loss. Gain insights into sequence labeling, tagger considerations for output structure, and the challenges associated with structured hinge loss.
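The Structured Perceptron covered in the lecture can be sketched in a few lines: predict the highest-scoring tag sequence under the current weights, and whenever the prediction differs from the gold sequence, add the gold features and subtract the predicted features. The toy feature set (emission and transition counts) and the exhaustive decoder below are illustrative simplifications, not the course's implementation; a real tagger would use Viterbi decoding.

```python
from collections import defaultdict
from itertools import product

def features(words, tags):
    """Feature counts for a (sentence, tag sequence) pair:
    emission (word, tag) and transition (prev_tag, tag) features."""
    f = defaultdict(int)
    prev = "<s>"
    for w, t in zip(words, tags):
        f[("emit", w, t)] += 1
        f[("trans", prev, t)] += 1
        prev = t
    return f

def score(weights, f):
    return sum(weights[k] * v for k, v in f.items())

def decode(weights, words, tagset):
    """Exhaustive argmax over all tag sequences (fine for toy inputs;
    a real system would use Viterbi for linear-chain features)."""
    best, best_s = None, float("-inf")
    for tags in product(tagset, repeat=len(words)):
        s = score(weights, features(words, tags))
        if s > best_s:
            best, best_s = list(tags), s
    return best

def train_structured_perceptron(data, tagset, epochs=5):
    """Structured perceptron update: on an error, w += phi(x, y_gold)
    and w -= phi(x, y_pred)."""
    w = defaultdict(float)
    for _ in range(epochs):
        for words, gold in data:
            pred = decode(w, words, tagset)
            if pred != gold:
                for k, v in features(words, gold).items():
                    w[k] += v
                for k, v in features(words, pred).items():
                    w[k] -= v
    return w
```

On a small separable dataset this converges quickly, since each mistake shifts weight from the incorrectly predicted structure onto the gold one.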

Syllabus

Intro
A Prediction Problem
Types of Prediction
Why Call it "Structured" Prediction?
Many Varieties of Structured Prediction!
Sequence Labeling as
Sequence Labeling w
Why Model Interactions in Output? • Consistency is important!
A Tagger Considering Output Structure
Training Structured Models
Local Normalization and Global Normalization
The Structured Perceptron Algorithm • An extremely simple way of training (non-probabilistic) global models
Structured Perceptron Loss
Contrasting Perceptron and Global Normalization • Globally normalized probabilistic model
Structured Training and Pre-training
Cost-Augmented Decoding for Hamming Loss • Hamming loss is decomposable over each word • Solution: add a cost to the score of each incorrect choice during search
What's Wrong w/ Structured Hinge Loss?
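The cost-augmented decoding idea from the syllabus can be illustrated with a minimal sketch. Because Hamming loss decomposes over positions, training-time decoding can simply add a per-position cost to every tag that disagrees with the gold tag, steering the search toward high-scoring *wrong* answers whose margin must then be pushed down. The function name, the per-position score dictionaries, and the absence of transition scores are assumptions for illustration, not the lecture's exact formulation.

```python
def cost_augmented_decode(scores, gold, cost=1.0):
    """scores: one {tag: model_score} dict per position.
    Adds `cost` to every tag that differs from the gold tag at that
    position, then takes the per-position argmax. This is valid
    because Hamming loss decomposes over positions when the model
    has no transition scores."""
    pred = []
    for pos, tag_scores in enumerate(scores):
        augmented = {t: s + (cost if t != gold[pos] else 0.0)
                     for t, s in tag_scores.items()}
        pred.append(max(augmented, key=augmented.get))
    return pred
```

With `cost=0` this reduces to ordinary decoding; with a positive cost, a wrong tag that scores close to the gold tag wins the augmented argmax, producing exactly the margin-violating candidate that the structured hinge loss penalizes.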


Taught by

Graham Neubig

Related Courses

TensorFlow Developer Certificate Exam Prep
A Cloud Guru
Post Graduate Certificate in Advanced Machine Learning & AI
Indian Institute of Technology Roorkee via Coursera
Advanced AI Techniques for the Supply Chain
LearnQuest via Coursera
Advanced Learning Algorithms
DeepLearning.AI via Coursera
IBM AI Engineering
IBM via Coursera