Word2Vec - Distributed Representations of Words and Phrases and Their Compositionality
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore the influential Word2Vec technique for generating distributed word representations in this comprehensive video lecture. Delve into the Skip-Gram model, hierarchical softmax, and negative sampling methods. Learn about the mysterious 3/4 power, frequent word subsampling, and their impact on training efficiency. Examine empirical results and gain insights into the practical applications of Word2Vec in modern natural language processing. Understand how this technique captures syntactic and semantic word relationships, and discover its limitations in representing word order and idiomatic phrases.
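For reference, the three formulas at the heart of the lecture, as stated in Mikolov et al. (2013); notation follows the paper, with v and v' the input and output vector representations of a word:

```latex
% Skip-gram with negative sampling: for each (input word w_I, context word w_O)
% pair, maximize
\log \sigma\big({v'_{w_O}}^{\top} v_{w_I}\big)
  + \sum_{i=1}^{k} \mathbb{E}_{w_i \sim P_n(w)}
    \Big[\log \sigma\big(-{v'_{w_i}}^{\top} v_{w_I}\big)\Big]

% Noise distribution: the unigram distribution raised to the "mysterious" 3/4 power
P_n(w) \propto U(w)^{3/4}

% Subsampling: each occurrence of word w_i with corpus frequency f(w_i)
% is discarded with probability
P(w_i) = 1 - \sqrt{t / f(w_i)}, \qquad t \approx 10^{-5}
```

Raising the unigram distribution to the 3/4 power flattens it, so frequent words are drawn as negatives somewhat less often, and rare words somewhat more often, than their raw counts would dictate; the paper reports that this choice outperformed both the plain unigram and the uniform distribution.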
Syllabus
- Intro & Outline
- Distributed Word Representations
- Skip-Gram Model
- Hierarchical Softmax
- Negative Sampling
- Mysterious 3/4 Power (sketched in code below)
- Frequent Words Subsampling (sketched in code below)
- Empirical Results
- Conclusion & Comments
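As a minimal illustration of the two sampling tricks above, here is a toy Python sketch, assuming word counts from an already-tokenized corpus; this is an illustrative rendition, not the paper's or the lecture's reference code:

```python
import math
import random
from collections import Counter

def noise_distribution(counts, power=0.75):
    """Negative-sampling noise distribution: unigram counts raised to
    the 3/4 power and renormalized, which boosts rare words relative
    to their raw frequency."""
    weights = {w: c ** power for w, c in counts.items()}
    total = sum(weights.values())
    return {w: wt / total for w, wt in weights.items()}

def keep_probability(freq, t=1e-5):
    """Subsampling of frequent words: probability of keeping a token
    whose relative corpus frequency is `freq`. The paper discards with
    probability 1 - sqrt(t / f(w)); t ~ 1e-5 is tuned for corpora of
    billions of tokens, so it discards almost everything in a toy corpus."""
    return min(1.0, math.sqrt(t / freq))

# Toy usage with a hypothetical nine-word corpus.
counts = Counter("the quick brown fox jumps over the lazy dog".split())
dist = noise_distribution(counts)
words, probs = zip(*dist.items())
negatives = random.choices(words, weights=probs, k=5)  # draw k negatives
print(negatives)
print(keep_probability(counts["the"] / sum(counts.values())))
```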
Taught by
Yannic Kilcher
Related Courses
- Interactive Word Embeddings using Word2Vec and Plotly (Coursera Project Network via Coursera)
- Машинное обучение на больших данных / Machine Learning on Big Data (Higher School of Economics via Coursera)
- Generating discrete sequences: language and music (Ural Federal University via edX)
- Explore Deep Learning for Natural Language Processing (Salesforce via Trailhead)
- Advanced NLP with Python for Machine Learning (LinkedIn Learning)