Neural Nets for NLP 2021 - Conditioned Generation
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Language Models • Language models are generative models of text
Conditioned Language Models
Calculating the Probability of a Sentence (chain-rule sketch after this syllabus)
Conditional Language Models
One Type of Language Model (Mikolov et al. 2011)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search (decoding sketch after this syllabus)
Ensembling • Combine predictions from multiple models (interpolation sketch after this syllabus)
Linear Interpolation • Take a weighted average of the M model probabilities
Log-linear Interpolation • Weighted combination of log probabilities, then normalize
Linear or Log Linear?
Parameter Averaging (checkpoint-averaging sketch after this syllabus)
Ensemble Distillation (e.g. Kim et al. 2016)
Stacking
Still a Difficult Problem!
From Speaker/Document Traits (Hoang et al. 2016)
From Lists of Traits (Kiddon et al. 2016)
From Word Embeddings (Noraset et al. 2017)
Basic Evaluation Paradigm
Human Evaluation Shared Tasks
Embedding-based Metrics
Perplexity
Which One to Use?
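The opening syllabus entries cover language models as generative models of text and scoring a sentence with the chain rule, log P(x) = sum over t of log P(x_t | x_1, ..., x_{t-1}). As a minimal illustration (not the lecture's code), the sketch below builds a count-based bigram model over a made-up toy corpus, scores a sentence as a product of smoothed conditional probabilities, and also reports the per-token perplexity named in the evaluation entries; the corpus and smoothing constant are assumptions.

```python
import math
from collections import Counter

# Toy training corpus; the sentences are assumptions for illustration.
corpus = [
    "<s> the cat sat on the mat </s>",
    "<s> the dog sat on the rug </s>",
    "<s> a cat saw the dog </s>",
]

bigrams, unigrams = Counter(), Counter()
for line in corpus:
    toks = line.split()
    unigrams.update(toks)
    bigrams.update(zip(toks, toks[1:]))

VOCAB = len(unigrams)
ALPHA = 0.1  # add-alpha smoothing constant (an assumed hyperparameter)

def cond_prob(prev, word):
    """Smoothed P(word | prev) estimated from bigram counts."""
    return (bigrams[prev, word] + ALPHA) / (unigrams[prev] + ALPHA * VOCAB)

def score(sentence):
    """Chain rule: log P(x) = sum over t of log P(x_t | x_{t-1})."""
    toks = sentence.split()
    logp = sum(math.log(cond_prob(p, w)) for p, w in zip(toks, toks[1:]))
    ppl = math.exp(-logp / (len(toks) - 1))  # perplexity per predicted token
    return logp, ppl

logp, ppl = score("<s> the cat sat on the rug </s>")
print(f"log P = {logp:.3f}, perplexity = {ppl:.2f}")
```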
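The generation-problem entries (ancestral sampling, greedy search, beam search) contrast the standard decoding strategies. The sketch below shows all three over a hand-built next-token table that stands in for a trained conditioned model; the words and probabilities are assumptions for illustration.

```python
import math
import random

# Toy next-token table standing in for a trained conditioned model;
# the entries are assumptions made for this illustration.
TABLE = {
    "<s>": {"a": 0.6, "the": 0.4},
    "a": {"cat": 0.4, "dog": 0.3, "bird": 0.3},
    "the": {"dog": 0.9, "</s>": 0.1},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
    "bird": {"</s>": 1.0},
}

def ancestral_sample(max_len=10):
    """Draw each token from P(x_t | x_{t-1}): exact samples from the model."""
    out = ["<s>"]
    while out[-1] != "</s>" and len(out) < max_len:
        dist = TABLE[out[-1]]
        out.append(random.choices(list(dist), weights=list(dist.values()))[0])
    return out

def greedy_search(max_len=10):
    """Commit to the single most probable token at every step."""
    out = ["<s>"]
    while out[-1] != "</s>" and len(out) < max_len:
        dist = TABLE[out[-1]]
        out.append(max(dist, key=dist.get))
    return out

def beam_search(k=2, max_len=10):
    """Keep the k highest log-probability partial hypotheses per step."""
    beams = [(0.0, ["<s>"])]
    for _ in range(max_len):
        candidates = []
        for logp, seq in beams:
            if seq[-1] == "</s>":  # finished hypotheses carry over unchanged
                candidates.append((logp, seq))
                continue
            for word, p in TABLE[seq[-1]].items():
                candidates.append((logp + math.log(p), seq + [word]))
        beams = sorted(candidates, reverse=True)[:k]
        if all(seq[-1] == "</s>" for _, seq in beams):
            break
    return beams[0][1]

print("sampled:", " ".join(ancestral_sample()))
print("greedy :", " ".join(greedy_search()))  # <s> a cat </s>, P = 0.24
print("beam   :", " ".join(beam_search()))    # <s> the dog </s>, P = 0.36
```

On this toy table, greedy search commits to the locally best first token "a" and ends at probability 0.24, while beam search with k = 2 recovers the globally better "<s> the dog </s>" at 0.36, which is the motivation for keeping multiple hypotheses.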
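The interpolation entries give the two standard combination rules for ensembling M models: linear interpolation averages probabilities, P(x) = sum over m of w_m P_m(x), while log-linear interpolation combines log probabilities and renormalizes, P(x) proportional to exp(sum over m of w_m log P_m(x)). A minimal numpy sketch with two models over a four-word vocabulary; the distributions and weights are made up.

```python
import numpy as np

# Two models' next-token distributions over a four-word vocabulary and the
# interpolation weights; all of these numbers are made up for illustration.
p1 = np.array([0.70, 0.20, 0.05, 0.05])
p2 = np.array([0.10, 0.60, 0.20, 0.10])
w1, w2 = 0.5, 0.5  # interpolation weights (sum to 1)

# Linear interpolation: weighted average of probabilities.
linear = w1 * p1 + w2 * p2

# Log-linear interpolation: weighted sum of log probabilities,
# renormalized (a softmax over the combined logits).
logits = w1 * np.log(p1) + w2 * np.log(p2)
loglinear = np.exp(logits - logits.max())
loglinear /= loglinear.sum()

print("linear    :", linear.round(3))     # [0.4   0.4   0.125 0.075]
print("log-linear:", loglinear.round(3))  # approx [0.338 0.443 0.128 0.090]
```

These numbers show the trade-off behind the "Linear or Log Linear?" entry: linear interpolation acts like an OR (one confident model keeps word 0 at 0.40), while log-linear acts like an AND (model 2's low 0.10 pulls word 0 down to about 0.34).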
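Parameter averaging approximates an ensemble at the inference cost of a single model by averaging the weights of several checkpoints. A minimal sketch, assuming checkpoints are dicts of numpy arrays with matching names and shapes; the parameter names here are hypothetical.

```python
import numpy as np

def average_checkpoints(checkpoints):
    """Elementwise average of parameter tensors across checkpoints.

    Assumes each checkpoint is a dict mapping parameter name -> array,
    with identical keys and shapes (e.g. the last epochs of one run).
    """
    return {k: np.mean([c[k] for c in checkpoints], axis=0)
            for k in checkpoints[0]}

# Hypothetical checkpoints (random stand-ins) from one training run.
ckpts = [{"W_embed": np.random.randn(4, 3), "b_out": np.random.randn(3)}
         for _ in range(3)]
averaged = average_checkpoints(ckpts)
print({k: v.shape for k, v in averaged.items()})
```

Unlike the interpolation methods above, this generally only makes sense for checkpoints of the same training run; independently trained models cannot usually be averaged parameter-wise.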
Taught by
Graham Neubig
Related Courses
Natural Language Processing on Google Cloud (Google Cloud via Coursera)
MIT 6.S191 - Automatic Speech Recognition (Alexander Amini via YouTube)
Introduction to T5 for Sentiment Span Extraction (Abhishek Thakur via YouTube)
CMU Advanced NLP 2021 - Conditional Generation (Graham Neubig via YouTube)
Neural Nets for NLP 2020: Conditioned Generation (Graham Neubig via YouTube)