Neural Nets for NLP 2019 - Word Vectors
Offered By: Graham Neubig via YouTube
Course Description
Overview
Syllabus
Intro
What do we want to know about words?
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
Word Embeddings are Cool! (An Obligatory Slide)
How to Train Word Embeddings?
Distributional Representations (see Goldberg 10.4.1)
Count-based Methods
Word Embeddings from Language Models
Context Window Methods
Skip-gram (Mikolov et al. 2013) • Predict each word in the context given the center word (a minimal training sketch follows the syllabus)
Count-based and Prediction-based Methods
GloVe (Pennington et al. 2014)
What Contexts?
Types of Evaluation
Extrinsic Evaluation: Using Word Embeddings in Systems
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Limitations of Embeddings
Sub-word Embeddings (1)
Multi-prototype Embeddings • Simple idea: words with multiple meanings should have different embeddings (Reisinger and Mooney 2010); a clustering sketch also follows the syllabus
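To make the skip-gram item above concrete, here is a minimal NumPy sketch of the idea from Mikolov et al. (2013): for each center word, predict each surrounding context word with a softmax over the vocabulary. The toy corpus and hyperparameters (dim, window, lr, epoch count) are illustrative assumptions, not values from the lecture, and real implementations use negative sampling rather than a full softmax.

```python
# Minimal skip-gram sketch (full softmax, toy corpus); all hyperparameters
# below are illustrative assumptions, not from the lecture.
import numpy as np

corpus = "the dog chased the cat the cat chased the mouse".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, dim, window, lr = len(vocab), 16, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, dim))   # center-word embeddings
W_out = rng.normal(scale=0.1, size=(V, dim))  # context-word embeddings

for epoch in range(200):
    for i, word in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if i == j:
                continue
            center, context = w2i[word], w2i[corpus[j]]
            v = W_in[center]
            scores = W_out @ v                   # logits over the vocabulary
            p = np.exp(scores - scores.max())
            p /= p.sum()                         # softmax
            p[context] -= 1.0                    # gradient of cross-entropy
            grad_in = W_out.T @ p
            W_out -= lr * np.outer(p, v)         # update context vectors
            W_in[center] -= lr * grad_in         # update center vector

print(W_in[w2i["dog"]][:4])  # first few dimensions of a learned embedding
```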
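Likewise, for the multi-prototype item, a sketch of the clustering idea in Reisinger and Mooney (2010): cluster the contexts in which a word occurs and keep one embedding per cluster. The 2-D "bank" context vectors and k = 2 below are hypothetical stand-ins for real pretrained context representations.

```python
# Multi-prototype sketch: one embedding per context cluster. The toy data
# (2-D vectors for "bank") and k=2 are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Pretend these are context vectors for occurrences of "bank":
river_contexts = rng.normal(loc=[1.0, 0.0], scale=0.1, size=(10, 2))
money_contexts = rng.normal(loc=[0.0, 1.0], scale=0.1, size=(10, 2))
X = np.vstack([river_contexts, money_contexts])

# Cluster the occurrences; each centroid is a sense-specific prototype.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.cluster_centers_)  # two embeddings for the same word form
```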
Taught by
Graham Neubig
Related Courses
Natural Language Processing (Columbia University via Coursera)
Natural Language Processing (Stanford University via Coursera)
Introduction to Natural Language Processing (University of Michigan via Coursera)
moocTLH: Nuevos retos en las tecnologías del lenguaje humano ("New Challenges in Human Language Technologies"; Universidad de Alicante via Miríadax)
Natural Language Processing (Indian Institute of Technology, Kharagpur via Swayam)