CMU Neural Nets for NLP 2018 - Models of Words
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
What do we want to know about words?
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
How to Train Word Embeddings?
Distributional vs. Distributed Representations
Count-based Methods
Distributional Representations (see Goldberg 10.4.1)
Context Window Methods
Count-based and Prediction-based Methods
GloVe (Pennington et al. 2014)
What Contexts?
Types of Evaluation
Non-linear Projection • Non-linear projections group things that are close in high-dimensional space, e.g. SNE/t-SNE (van der Maaten and Hinton 2008) group things that give each other a high probability according to a Gaussian
t-SNE Visualization can be Misleading! (Wattenberg et al. 2016)
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Extrinsic Evaluation: Using Word Embeddings in Systems
How Do I Choose Embeddings?
When are Pre-trained Embeddings Useful?
Limitations of Embeddings
Sub-word Embeddings (1)
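The syllabus covers count-based context window methods, which build word representations from co-occurrence statistics. As an illustration of the idea (not code from the lecture), the sketch below counts (word, context word) pairs within a fixed window; the function name and toy sentence are made up for this example:

```python
from collections import Counter

def cooccurrence_counts(sentences, window=2):
    """Count (word, context word) pairs within a fixed-size window.

    sentences: list of tokenized sentences (lists of strings).
    Returns a Counter mapping (word, context) pairs to counts.
    """
    counts = Counter()
    for sent in sentences:
        for i, word in enumerate(sent):
            lo = max(0, i - window)
            hi = min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:  # skip the center word itself
                    counts[(word, sent[j])] += 1
    return counts

# Toy example: with window=1, "cat" co-occurs once each with "the" and "sat".
sentences = [["the", "cat", "sat", "on", "the", "mat"]]
counts = cooccurrence_counts(sentences, window=1)
```

Methods like GloVe start from exactly this kind of co-occurrence matrix, while prediction-based methods (e.g. skip-gram) instead train embeddings to predict the context words directly.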
Taught by
Graham Neubig
Related Courses
Natural Language Processing (Columbia University via Coursera)
Natural Language Processing (Stanford University via Coursera)
Introduction to Natural Language Processing (University of Michigan via Coursera)
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (Universidad de Alicante via Miríadax)
Natural Language Processing (Indian Institute of Technology, Kharagpur via Swayam)