CMU Neural Nets for NLP 2018 - Models of Words

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses Natural Language Processing (NLP) Courses Word Embeddings Courses

Course Description

Overview

Explore the fundamentals of word representations in natural language processing through this comprehensive lecture from Carnegie Mellon University's Neural Networks for NLP course. Delve into various approaches for modeling words, from manual attempts like WordNet to modern word embedding techniques. Examine the differences between distributional and distributed representations, and learn about count-based and prediction-based methods for training word embeddings. Investigate the importance of context in word representations and discover different evaluation techniques for assessing embedding quality. Analyze the strengths and limitations of word embeddings, and gain insights into choosing appropriate embeddings for specific tasks. Conclude by exploring sub-word embedding techniques to address limitations of traditional word-level representations.
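The count-based methods covered in the lecture can be sketched in a few lines: build a word-by-context co-occurrence matrix from a corpus, then compress it into dense vectors with SVD. The toy corpus, window size, and dimensionality below are illustrative choices, not the course's own setup.

```python
import numpy as np

# Toy corpus for illustration only
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts with a +/-2 word context window
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1

# Truncated SVD turns sparse counts into dense (distributed) vectors
U, S, _ = np.linalg.svd(C, full_matrices=False)
dim = 4
embeddings = U[:, :dim] * S[:dim]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words that share contexts ("cat"/"dog" both follow "the"/"a" and
# precede "sat"/"and") should end up with similar vectors
sim = cosine(embeddings[idx["cat"]], embeddings[idx["dog"]])
```

Prediction-based methods such as word2vec reach similar representations by instead training a model to predict a word from its context (or vice versa).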

Syllabus

Intro
What do we want to know about words?
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
How to Train Word Embeddings?
Distributional vs. Distributed Representations
Count-based Methods
Distributional Representations (see Goldberg 10.4.1): Words appear in a context
Context Window Methods
Count-based and Prediction-based Methods
GloVe (Pennington et al. 2014)
What Contexts?
Types of Evaluation
Non-linear Projection: non-linear projections group things that are close in high-dimensional space, e.g. t-SNE (van der Maaten and Hinton 2008) groups things that give each other a high probability according to a Gaussian
t-SNE Visualization can be Misleading! (Wattenberg et al. 2016)
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Extrinsic Evaluation: Using Word Embeddings in Systems
How Do I Choose Embeddings?
When are Pre-trained Embeddings Useful?
Limitations of Embeddings
Sub-word Embeddings (1)
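The final syllabus item, sub-word embeddings, addresses a key limitation of word-level vectors: out-of-vocabulary words get no representation. A minimal sketch in the style of fastText represents a word as the average of vectors for its character n-grams; the bucket count, dimensionality, and random table below are assumptions standing in for trained parameters.

```python
import zlib
import numpy as np

def char_ngrams(word, n_min=3, n_max=5):
    # Boundary markers "<" and ">" let prefixes/suffixes be distinguished
    marked = f"<{word}>"
    return [marked[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(marked) - n + 1)]

N_BUCKETS = 2000  # n-grams are hashed into a fixed-size table
DIM = 16
rng = np.random.default_rng(0)
table = rng.normal(size=(N_BUCKETS, DIM))  # stands in for trained vectors

def word_vector(word):
    # Average the vectors of the word's hashed character n-grams
    rows = [zlib.crc32(g.encode()) % N_BUCKETS for g in char_ngrams(word)]
    return table[rows].mean(axis=0)

# Even an unseen word like "catlike" gets a vector, and it shares
# n-grams (hence vector mass) with "cat"
v = word_vector("catlike")
```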


Taught by

Graham Neubig

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX