Neural Nets for NLP 2020: Conditioned Generation

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses
Natural Language Processing (NLP) Courses
Search Algorithms Courses
Language Models Courses
Encoder-Decoder Models Courses

Course Description

Overview

Explore conditioned generation in neural networks for natural language processing through this comprehensive lecture from CMU's CS 11-747 course. Delve into encoder-decoder models, conditional generation techniques, and search algorithms. Examine ensembling methods, evaluation metrics, and various types of data used for conditioning. Learn about language models, including conditional and generative variants, and their applications. Understand the generation problem, sampling methods, and search strategies like greedy and beam search. Investigate log-linear interpolation, stacking, and evaluation paradigms including human evaluation and perplexity. Gain insights into the intricacies of passing hidden states and the differences between linear and log-linear approaches in neural network architectures for NLP tasks.
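
The sampling and search strategies named above are compact enough to sketch in plain Python. The sketch below is illustrative only, not code from the lecture: the next_token_logprobs callback (mapping a token prefix to a dict of token -> log probability) and the eos end-of-sequence marker are hypothetical stand-ins for a trained conditional language model.

import math
import random
from heapq import nlargest

def ancestral_sample(next_token_logprobs, eos, max_len=20):
    # Ancestral sampling: draw each token from the model's
    # distribution, conditioned on the tokens drawn so far.
    seq = []
    for _ in range(max_len):
        dist = next_token_logprobs(seq)
        tokens = list(dist)
        weights = [math.exp(dist[t]) for t in tokens]
        token = random.choices(tokens, weights=weights)[0]
        if token == eos:
            break
        seq.append(token)
    return seq

def greedy_decode(next_token_logprobs, eos, max_len=20):
    # Greedy search: at each step commit to the single most
    # probable next token.
    seq = []
    for _ in range(max_len):
        dist = next_token_logprobs(seq)
        token = max(dist, key=dist.get)
        if token == eos:
            break
        seq.append(token)
    return seq

def beam_decode(next_token_logprobs, eos, beam_size=4, max_len=20):
    # Beam search: keep the beam_size highest-scoring partial
    # hypotheses instead of committing to a single one.
    beams = [([], 0.0)]  # (tokens, cumulative log probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for token, lp in next_token_logprobs(seq).items():
                if token == eos:
                    finished.append((seq, score + lp))
                else:
                    candidates.append((seq + [token], score + lp))
        if not candidates:
            break
        beams = nlargest(beam_size, candidates, key=lambda c: c[1])
    finished.extend(beams)
    return max(finished, key=lambda c: c[1])[0]

With beam_size=1, beam_decode reduces to greedy search (up to tie-breaking), which is a quick sanity check on the implementation.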

Syllabus

Language Models: language models are generative models of text
Conditioned Language Models
Conditional Language Models
One Type of Conditional Language Model (Sutskever et al., 2014)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search
Log-linear Interpolation: weighted combination of log probabilities, then normalize (sketched below, after the syllabus)
Linear or Log Linear?
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity
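
The last two items above also reduce to short formulas: log-linear interpolation combines the log probabilities of several models with weights and renormalizes, and perplexity is the exponentiated average negative log likelihood per token. Below is a minimal Python sketch, assuming each model's distribution is given as a dict from token to log probability over a shared vocabulary; the function names are illustrative, not from the course.

import math

def log_linear_interpolate(logprob_dists, weights):
    # Weighted sum of log probabilities per token, then subtract the
    # log normalizer so the result is again a proper distribution.
    vocab = logprob_dists[0].keys()
    scores = {t: sum(w * d[t] for w, d in zip(weights, logprob_dists))
              for t in vocab}
    log_z = math.log(sum(math.exp(s) for s in scores.values()))
    return {t: s - log_z for t, s in scores.items()}

def perplexity(token_logprobs):
    # Perplexity: exp of the average negative log likelihood
    # over a sequence of per-token log probabilities.
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

As a sanity check, with a single model and weight 1.0, log_linear_interpolate returns that model's distribution unchanged, since the normalizer is log 1 = 0.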


Taught by

Graham Neubig

Related Courses

Create Image Captioning Models
Google via Google Cloud Skills Boost
Create Image Captioning Models
Google Cloud via Coursera
Create Image Captioning Models - in Hebrew
Google Cloud via Coursera
Generative AI: Introduction to Large Language Models
LinkedIn Learning
Natural Language Processing on Google Cloud
Google Cloud via Coursera