Neural Nets for NLP 2017 - Conditioned Generation

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Natural Language Processing (NLP) Courses, Language Models Courses, Encoder-Decoder Models Courses

Course Description

Overview

Explore a comprehensive lecture on conditioned generation in neural networks for natural language processing. Delve into encoder-decoder models, conditional generation techniques, and search algorithms. Learn about ensembling methods, evaluation strategies, and various types of data used for conditioning. Access accompanying slides and code examples for hands-on learning. Gain insights into language models, the generation problem, and evaluation paradigms, including human evaluation and perplexity. Part of CMU's Neural Networks for NLP course, this lecture provides essential knowledge for understanding and implementing advanced NLP techniques.

Syllabus

Intro
Language Models: generative models of text
Conditioned Language Models
Conditional Language Models
One Type of Conditional Language Model (Sutskever et al. 2014)
The Generation Problem
Ancestral Sampling
Greedy Search
Ensembling: Combine predictions from multiple models
Log-linear Interpolation: Weighted combination of log probabilities, then normalize (see the interpolation sketch below)
Linear or Log Linear?
Parameter Averaging
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity (see the perplexity sketch below)
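
The syllabus contrasts linear and log-linear interpolation as two ways of ensembling model predictions. The sketch below is a minimal illustration of that distinction, assuming two models' next-token distributions are available as NumPy arrays; the function names and toy numbers are illustrative, not taken from the lecture's code.

```python
# Minimal sketch (illustrative, not the lecture's code): combining two models'
# next-token distributions by linear vs. log-linear interpolation.
import numpy as np

def linear_interpolation(p1, p2, alpha=0.5):
    """Weighted average of probabilities: alpha*p1 + (1-alpha)*p2."""
    return alpha * p1 + (1 - alpha) * p2

def log_linear_interpolation(p1, p2, alpha=0.5):
    """Weighted combination of log probabilities, then renormalize (softmax)."""
    log_combo = alpha * np.log(p1) + (1 - alpha) * np.log(p2)
    unnorm = np.exp(log_combo - log_combo.max())  # subtract max for numerical stability
    return unnorm / unnorm.sum()

# Toy vocabulary of 4 tokens; each array is one model's predicted distribution.
p_model1 = np.array([0.7, 0.2, 0.05, 0.05])
p_model2 = np.array([0.1, 0.6, 0.2, 0.1])
print(linear_interpolation(p_model1, p_model2))      # keeps mass on tokens either model likes
print(log_linear_interpolation(p_model1, p_model2))  # concentrates on tokens both models like
```

Roughly, the linear combination behaves like an "OR" (a token survives if any model assigns it probability), while the log-linear combination behaves like an "AND" (a token needs reasonable probability under every model), which is the trade-off the "Linear or Log Linear?" slide asks about.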
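For the perplexity item in the evaluation part of the syllabus, the following is a minimal sketch of the standard computation, assuming the model's per-token log probabilities on a reference text are already collected; it is not the course's evaluation script.

```python
# Minimal sketch (illustrative): perplexity is the exponentiated
# average negative log-likelihood per token.
import math

def perplexity(token_log_probs):
    """token_log_probs: natural-log probabilities the model assigned to each
    reference token. Lower perplexity means the model was less surprised."""
    avg_neg_ll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_neg_ll)

# Example: assigning probability 0.25 to each of 8 tokens gives perplexity 4.
print(perplexity([math.log(0.25)] * 8))  # -> 4.0
```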


Taught by

Graham Neubig

Related Courses

Natural Language Processing on Google Cloud
Google Cloud via Coursera
MIT 6.S191 - Automatic Speech Recognition
Alexander Amini via YouTube
Introduction to T5 for Sentiment Span Extraction
Abhishek Thakur via YouTube
CMU Advanced NLP 2021 - Conditional Generation
Graham Neubig via YouTube
Neural Nets for NLP 2021 - Conditioned Generation
Graham Neubig via YouTube