Neural Nets for NLP 2021 - Conditioned Generation
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
Language Models: Generative Models of Text
Conditioned Language Models
Calculating the Probability of a Sentence (sketched below)
Conditional Language Models
One Type of Language Model (Mikolov et al. 2011)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search (decoding strategies sketched below)
Ensembling: Combine Predictions from Multiple Models
Linear Interpolation: Take a Weighted Average of the M Model Probabilities
Log-linear Interpolation: Weighted Combination of Log Probabilities, Then Normalize (both interpolation schemes sketched below)
Linear or Log Linear?
Parameter Averaging (sketched below)
Ensemble Distillation (e.g. Kim et al. 2016)
Stacking
Still a Difficult Problem!
From Speaker/Document Traits (Hoang et al. 2016)
From Lists of Traits (Kiddon et al. 2016)
From Word Embeddings (Noraset et al. 2017)
Basic Evaluation Paradigm
Human Evaluation Shared Tasks
Embedding-based Metrics
Perplexity (sketched below)
Which One to Use?
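Code Sketches

The decoding strategies in the syllabus lend themselves to a short illustration. Below is a minimal sketch of ancestral sampling, greedy search, and beam search; the vocabulary VOCAB, the stand-in next_token_probs function, and every probability here are invented for illustration and are not taken from the lecture.

import math
import random

VOCAB = ["<eos>", "a", "b", "c"]

def next_token_probs(prefix):
    # Hypothetical stand-in for p(y_t | y_<t, x); a real conditioned
    # LM would condition on the source input x and the decoded prefix.
    rng = random.Random(len(prefix))  # deterministic toy distribution
    weights = [rng.random() + 0.1 for _ in VOCAB]
    total = sum(weights)
    return [w / total for w in weights]

def ancestral_sample(max_len=10):
    # Draw each token from the model's distribution: unbiased, but any
    # single sample may be low-probability.
    out = []
    while len(out) < max_len:
        tok = random.choices(VOCAB, weights=next_token_probs(out))[0]
        if tok == "<eos>":
            return out
        out.append(tok)
    return out

def greedy_search(max_len=10):
    # Take the single most probable token at every step.
    out = []
    while len(out) < max_len:
        probs = next_token_probs(out)
        tok = VOCAB[max(range(len(VOCAB)), key=probs.__getitem__)]
        if tok == "<eos>":
            return out
        out.append(tok)
    return out

def beam_search(beam_size=3, max_len=10):
    # Keep the beam_size highest-scoring partial hypotheses per step,
    # scored by summed log-probability.
    beams = [([], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for toks, score in beams:
            for tok, p in zip(VOCAB, next_token_probs(toks)):
                cand = (toks + [tok], score + math.log(p))
                (finished if tok == "<eos>" else candidates).append(cand)
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    best = max(finished + beams, key=lambda c: c[1])
    return [t for t in best[0] if t != "<eos>"]

print("sampled:", ancestral_sample())
print("greedy: ", greedy_search())
print("beam:   ", beam_search())

Sampling trades fidelity for diversity, greedy search is fast but can commit to early mistakes, and beam search keeps several hypotheses in play at extra cost.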
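The two interpolation schemes from the syllabus differ in how they combine models. A minimal sketch with two invented three-token distributions and equal weights lam:

import math

p1 = [0.7, 0.2, 0.1]   # model 1's next-token distribution (invented)
p2 = [0.1, 0.2, 0.7]   # model 2's next-token distribution (invented)
lam = [0.5, 0.5]       # interpolation weights, summing to 1

# Linear interpolation: P(y) = sum_m lam_m * P_m(y)
linear = [lam[0] * a + lam[1] * b for a, b in zip(p1, p2)]

# Log-linear interpolation: P(y) is proportional to
# exp(sum_m lam_m * log P_m(y)), normalized to sum to 1.
unnorm = [math.exp(lam[0] * math.log(a) + lam[1] * math.log(b))
          for a, b in zip(p1, p2)]
z = sum(unnorm)
log_linear = [u / z for u in unnorm]

print("linear:    ", [round(p, 3) for p in linear])      # [0.4, 0.2, 0.4]
print("log-linear:", [round(p, 3) for p in log_linear])  # ~[0.363, 0.274, 0.363]

With disagreeing models, linear interpolation (roughly a logical "or") preserves each model's favorite token, while log-linear interpolation (roughly a logical "and") shifts mass toward tokens that no model strongly vetoes.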
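Parameter averaging captures part of the benefit of ensembling at single-model cost by averaging the parameters of several checkpoints of one architecture elementwise. A minimal sketch, with plain dicts of floats standing in for real checkpoints:

# Checkpoints of the same model at different training steps (invented values).
checkpoints = [
    {"w": 0.8, "b": -0.1},
    {"w": 1.0, "b": 0.1},
    {"w": 1.2, "b": 0.0},
]

# Elementwise average over checkpoints; the result is one parameter set
# loadable into a single model.
averaged = {name: sum(c[name] for c in checkpoints) / len(checkpoints)
            for name in checkpoints[0]}
print(averaged)  # {'w': 1.0, 'b': 0.0}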
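Finally, on the evaluation side: the probability of a sentence follows from the chain rule, P(Y) = prod_t P(y_t | y_<t) (additionally conditioned on an input X for a conditioned model), and perplexity is the exponentiated average negative log-likelihood per token. The per-token probabilities below are invented; a real evaluation would read them off a trained model.

import math

# p(y_t | y_<t) for each token of one sentence, <eos> included (invented).
token_probs = [0.2, 0.5, 0.9, 0.4, 0.8]

log_prob = sum(math.log(p) for p in token_probs)     # log P(Y)
perplexity = math.exp(-log_prob / len(token_probs))  # lower is better

print(f"log P(Y) = {log_prob:.3f}  perplexity = {perplexity:.3f}")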
Taught by
Graham Neubig
Related Courses
TensorFlow Developer Certificate Exam Prep (A Cloud Guru)
Post Graduate Certificate in Advanced Machine Learning & AI (Indian Institute of Technology Roorkee via Coursera)
Advanced AI Techniques for the Supply Chain (LearnQuest via Coursera)
Advanced Learning Algorithms (DeepLearning.AI via Coursera)
IBM AI Engineering (IBM via Coursera)