CMU Neural Nets for NLP 2018 - Conditioned Generation

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses
Language Models Courses

Course Description

Overview

Explore conditioned generation in neural networks for natural language processing through this lecture from Carnegie Mellon University's 2018 Neural Nets for NLP course. Delve into language models and conditioned language models, generation by ancestral sampling, and methods for combining models such as ensembling. Learn about evaluation methods including human evaluation and perplexity, and see how linear and log-linear interpolation differ when combining models. Discover related techniques such as parameter averaging, ensemble distillation, and stacking. Conclude with a contrastive note on evaluating unconditioned generation, rounding out this crucial aspect of NLP.
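
As a concrete illustration of the sampling idea covered in the lecture, here is a minimal Python sketch of ancestral sampling from a conditioned language model P(Y|X): each token is drawn from the model's distribution given the source input and everything generated so far. The sketch is not part of the course materials, and next_token_probs is a hypothetical stand-in for a trained model's conditional distribution.

```python
import random

def next_token_probs(source, prefix):
    """Return {token: probability} for the next token, conditioned on the
    source input and the generated prefix. Hypothetical toy distributions,
    hard-coded only to make the example runnable."""
    if not prefix:
        return {"hello": 0.6, "hi": 0.4}
    if prefix[-1] in ("hello", "hi"):
        return {"world": 0.7, "there": 0.2, "</s>": 0.1}
    return {"</s>": 1.0}

def ancestral_sample(source, max_len=10):
    """Sample y_1, y_2, ... one token at a time, each drawn from the model's
    conditional distribution given the source and everything sampled so far."""
    output = []
    for _ in range(max_len):
        probs = next_token_probs(source, output)
        tokens, weights = zip(*probs.items())
        token = random.choices(tokens, weights=weights, k=1)[0]
        if token == "</s>":  # end-of-sentence symbol terminates generation
            break
        output.append(token)
    return output

print(ancestral_sample("bonjour le monde"))  # e.g. ['hello', 'world']
```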

Syllabus

Intro
Language Models: Generative Models of Text
Conditioned Language Models
Ancestral Sampling
Ensembling (see the sketch after this syllabus)
Linear or Log Linear?
Parameter Averaging
Ensemble Distillation (e.g. Kim et al. 2016)
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity
A Contrastive Note: Evaluating Unconditioned Generation
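
To make the ensembling and perplexity items above concrete, the following Python sketch (not from the course materials; the distributions and weights are invented for illustration) contrasts linear interpolation, which averages probabilities, with log-linear interpolation, which averages log-probabilities and therefore behaves more like a logical AND over the models. It ends with a perplexity computation: the exponentiated average negative log-likelihood per token.

```python
import math

def linear_ensemble(p1, p2, weight=0.5):
    """Linear interpolation: average the two models' probabilities directly."""
    return {t: weight * p1.get(t, 0.0) + (1 - weight) * p2.get(t, 0.0)
            for t in set(p1) | set(p2)}

def log_linear_ensemble(p1, p2, weight=0.5):
    """Log-linear interpolation: average log-probabilities, then renormalize.
    A token must have nonzero probability under *both* models to survive."""
    tokens = set(p1) & set(p2)
    scores = {t: weight * math.log(p1[t]) + (1 - weight) * math.log(p2[t])
              for t in tokens}
    z = sum(math.exp(s) for s in scores.values())
    return {t: math.exp(s) / z for t, s in scores.items()}

def perplexity(token_probs):
    """Perplexity = exp(average negative log-likelihood per token)."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# Toy next-token distributions from two hypothetical models.
p1 = {"world": 0.7, "there": 0.3}
p2 = {"world": 0.4, "there": 0.1, "friend": 0.5}
print(linear_ensemble(p1, p2))      # smooths over both models' preferences
print(log_linear_ensemble(p1, p2))  # keeps only tokens both models like
print(perplexity([0.6, 0.7, 0.9]))  # lower is better
```
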


Taught by

Graham Neubig

Related Courses

Building a unique NLP project: 1984 book vs 1984 album
Coursera Project Network via Coursera
Exam Prep AI-102: Microsoft Azure AI Engineer Associate
Whizlabs via Coursera
Amazon Echo Reviews Sentiment Analysis Using NLP
Coursera Project Network via Coursera
Amazon Translate: Translate documents with batch translation
Coursera Project Network via Coursera
Analyze Text Data with Yellowbrick
Coursera Project Network via Coursera