
Neural Nets for NLP 2017 - A Simple Exercise - Predicting the Next Word in a Sentence

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Natural Language Processing (NLP) Courses, Text Analysis Courses, Language Models Courses, Loss Functions Courses, Word Vectors Courses

Course Description

Overview

Explore a comprehensive lecture on neural networks for natural language processing, focusing on predicting the next word in a sentence. Delve into topics such as describing words by their context, counting and prediction techniques, skip-grams and Continuous Bag of Words (CBOW) models, and methods for evaluating and visualizing word vectors. Learn about advanced techniques for word vector creation and gain practical insights through provided slides and code examples. This lecture, part of CMU's CS 11-747 course, offers a deep dive into the fundamentals of word embeddings and their applications in NLP tasks.
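
To make the lecture's topic concrete, here is a minimal sketch of a feed-forward next-word predictor. It is written in PyTorch and is not taken from the course materials; the toy corpus, context window, and hyperparameters are illustrative assumptions. It walks the same steps the syllabus below covers: embedding lookup, scoring, converting scores to probabilities, computing a loss, and updating parameters.

import torch
import torch.nn as nn

# Toy corpus and vocabulary (illustrative placeholders, not the lecture's data).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
N = 2  # predict word t from the N previous words

class FeedForwardLM(nn.Module):
    # Embed each context word, concatenate the embeddings, pass them through
    # a hidden layer, and produce one unnormalized score per vocabulary item.
    def __init__(self, vocab_size, emb_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)    # lookup
        self.hidden = nn.Linear(N * emb_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)      # scores over vocab

    def forward(self, context):                   # context: (batch, N) word ids
        e = self.embed(context).view(context.size(0), -1)
        h = torch.tanh(self.hidden(e))
        return self.out(h)                        # logits over the vocabulary

# Build (context, next word) training pairs from the corpus.
pairs = [([word2id[w] for w in corpus[i - N:i]], word2id[corpus[i]])
         for i in range(N, len(corpus))]

model = FeedForwardLM(len(vocab))
loss_fn = nn.CrossEntropyLoss()   # softmax + negative log-likelihood in one step
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    for ctx, tgt in pairs:
        opt.zero_grad()
        logits = model(torch.tensor([ctx]))           # (1, vocab_size) scores
        loss = loss_fn(logits, torch.tensor([tgt]))   # loss against the gold word
        loss.backward()                               # backprop through the graph
        opt.step()                                    # parameter update

Note that nn.CrossEntropyLoss applies the softmax internally, which is why the model returns raw scores; a handwritten version would exponentiate and normalize the scores to obtain probabilities first.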

Syllabus

Introduction
Evaluation
Language Models
Featurized Models
Example
Converting Scores to Probabilities
Computation Graph
Lookup
Loss Function
Parameter Update
Unknown Words
Vocabulary
Unfair Advantage
Problems of Previous Model
Neural Language Models
Code
Input/Output Embedding
Training Tricks
Learning Rate Decay (see the sketch below)
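
For the final syllabus item, here is a minimal sketch of learning rate decay, again in PyTorch; the inverse-time schedule and decay constant are illustrative assumptions rather than the lecture's exact recipe.

import torch

# A stand-in model: a single parameter vector with a quadratic loss.
w = torch.nn.Parameter(torch.randn(3))
opt = torch.optim.SGD([w], lr=0.1)

# Inverse-time decay: lr_t = lr_0 / (1 + r * epoch), with r = 0.5 chosen
# arbitrarily for the demonstration.
sched = torch.optim.lr_scheduler.LambdaLR(opt, lambda epoch: 1.0 / (1.0 + 0.5 * epoch))

for epoch in range(5):
    opt.zero_grad()
    loss = (w ** 2).sum()
    loss.backward()
    opt.step()                        # parameter update at the current rate
    sched.step()                      # shrink the learning rate for the next epoch
    print(epoch, sched.get_last_lr())

Shrinking the step size over time lets training take large steps early and settle into a minimum later, which is the usual motivation for this trick.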


Taught by

Graham Neubig

Related Courses

DCO042 - Python For Informatics
University of Michigan via Independent
Corpus Linguistics: Method, Analysis, Interpretation
Lancaster University via FutureLearn
Freedom and Equality in Medieval Japan (ga001)
University of Tokyo via gacco
"A Study in Scarlet" by Doyle: BerkeleyX Book Club
University of California, Berkeley via edX
"A Room with a View" by Forster: BerkeleyX Book Club
University of California, Berkeley via edX