Neural Nets for NLP 2021 - Conditioned Generation
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Language Models • Language models are generative models of text
Conditioned Language Models
Calculating the Probability of a Sentence (see the chain-rule note after this syllabus)
Conditional Language Models
One Type of Language Model (Mikolov et al. 2011)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search (see the decoding sketch after this syllabus)
Ensembling • Combine predictions from multiple models
Linear Interpolation • Take a weighted average of the M model probabilities
Log-linear Interpolation • Take a weighted combination of log probabilities, then normalize (see the interpolation sketch after this syllabus)
Linear or Log Linear?
Parameter Averaging
Ensemble Distillation (e.g. Kim et al. 2016)
Stacking
Still a Difficult Problem!
From Speaker/Document Traits (Hoang et al. 2016)
From Lists of Traits (Kiddon et al. 2016)
From Word Embeddings (Noraset et al. 2017)
Basic Evaluation Paradigm
Human Evaluation Shared Tasks
Embedding-based Metrics
Perplexity (see the computation sketch after this syllabus)
Which One to Use?
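
The entries on calculating sentence probability and conditional language models come down to the chain rule. As a minimal note in standard notation (the symbols are the conventional ones, not copied from the slides), an unconditioned model factors the probability of a sentence $Y$ token by token,

P(Y) = \prod_{t=1}^{|Y|} P(y_t \mid y_1, \ldots, y_{t-1}),

and a conditioned language model simply adds the input $X$ (a source sentence, speaker traits, a word to be defined, and so on) to every conditioning context:

P(Y \mid X) = \prod_{t=1}^{|Y|} P(y_t \mid X, y_1, \ldots, y_{t-1}).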
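The Generation Problem, Ancestral Sampling, Greedy Search, and Beam Search entries contrast three standard ways of decoding from such a model. Below is a runnable Python sketch over a toy distribution; next_token_probs is a made-up stand-in for a real conditioned model, not anything from the lecture.

import math
import random

def next_token_probs(prefix):
    # Toy stand-in: a real model would also condition on the input X.
    if len(prefix) >= 3:
        return {"a": 0.1, "b": 0.1, "</s>": 0.8}
    return {"a": 0.5, "b": 0.3, "</s>": 0.2}

def ancestral_sample(max_len=10):
    # Sample y_t ~ P(y_t | y_<t) until </s>: unbiased draws from the model.
    prefix = []
    while len(prefix) < max_len:
        probs = next_token_probs(prefix)
        tok = random.choices(list(probs), weights=list(probs.values()))[0]
        prefix.append(tok)
        if tok == "</s>":
            break
    return prefix

def greedy_search(max_len=10):
    # Take the argmax token at each step: cheap, but can miss globally better outputs.
    prefix = []
    while len(prefix) < max_len:
        probs = next_token_probs(prefix)
        tok = max(probs, key=probs.get)
        prefix.append(tok)
        if tok == "</s>":
            break
    return prefix

def beam_search(beam_size=2, max_len=10):
    # Keep the beam_size best partial hypotheses by cumulative log probability.
    beams = [([], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            for tok, p in next_token_probs(prefix).items():
                hyp = (prefix + [tok], score + math.log(p))
                (finished if tok == "</s>" else candidates).append(hyp)
        if not candidates:
            break
        beams = sorted(candidates, key=lambda h: h[1], reverse=True)[:beam_size]
    return max(finished + beams, key=lambda h: h[1])[0]

print(ancestral_sample(), greedy_search(), beam_search())

Note that no length normalization is applied when comparing finished hypotheses; real beam-search implementations usually add it.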
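For the ensembling entries, linear and log-linear interpolation differ in where the weighted average is taken, and they behave differently, which is the question the "Linear or Log Linear?" slide raises: linear interpolation acts roughly like an OR (one confident model can keep a candidate alive), log-linear like an AND (any model assigning near-zero probability can veto). A sketch; the weights and distributions below are made up.

import math

def linear_interpolate(dists, weights):
    # P(y) = sum_m w_m * P_m(y); already normalized if the weights sum to 1.
    return [sum(w * d[i] for w, d in zip(weights, dists))
            for i in range(len(dists[0]))]

def log_linear_interpolate(dists, weights):
    # score(y) = sum_m w_m * log P_m(y), then renormalize with a softmax.
    scores = [sum(w * math.log(d[i]) for w, d in zip(weights, dists))
              for i in range(len(dists[0]))]
    m = max(scores)
    z = m + math.log(sum(math.exp(s - m) for s in scores))  # log-sum-exp
    return [math.exp(s - z) for s in scores]

p1 = [0.60, 0.39, 0.01]  # model 1 over a 3-word vocabulary (made up)
p2 = [0.01, 0.39, 0.60]  # model 2 disagrees with model 1 on words 1 and 3
print(linear_interpolate([p1, p2], [0.5, 0.5]))      # [0.305, 0.39, 0.305]: extremes keep mass, OR-like
print(log_linear_interpolate([p1, p2], [0.5, 0.5]))  # middle word dominates: near-zero votes veto, AND-like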
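Perplexity, from the evaluation entries, is the exponentiated average negative log likelihood the model assigns to the reference tokens; a minimal computation with illustrative probabilities:

import math

def perplexity(token_probs):
    # exp of the mean negative log probability per reference token; lower is better.
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

print(perplexity([0.25, 0.50, 0.10, 0.80]))  # made-up per-token model probabilities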
Taught by
Graham Neubig
Related Courses
Natural Language Processing (Columbia University via Coursera)
Natural Language Processing (Stanford University via Coursera)
Introduction to Natural Language Processing (University of Michigan via Coursera)
moocTLH: Nuevos retos en las tecnologías del lenguaje humano [New Challenges in Human Language Technologies] (Universidad de Alicante via Miríadax)
Natural Language Processing (Indian Institute of Technology, Kharagpur via Swayam)