
Neural Nets for NLP 2017 - Recurrent Neural Networks

Offered By: Graham Neubig via YouTube

Tags

Neural Networks Courses, Deep Learning Courses, Natural Language Processing (NLP) Courses, Long Short-Term Memory (LSTM) Courses

Course Description

Overview

Explore recurrent neural networks in this lecture from CMU's Neural Networks for NLP course. Dive into the fundamentals of recurrent networks, addressing challenges like vanishing gradients through LSTMs. Analyze the strengths and weaknesses of recurrence in sentence modeling and discover pre-training techniques for RNNs. Access accompanying slides and code examples to reinforce your understanding of key concepts including parameter tying, language modeling, sentence representation, and handling long sequences with mini-batching methods.
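The lecture's own code examples use DyNet. As a rough, framework-free illustration of the two ideas the overview highlights (the basic recurrent update and the LSTM gating that mitigates vanishing gradients), here is a minimal NumPy sketch; the function names and dimensions are invented for the example and are not from the course materials.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # Vanilla RNN update: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h).
    # Repeatedly multiplying by W_hh is what makes gradients vanish
    # (or explode) over long sequences.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

def lstm_step(x_t, h_prev, c_prev, W, b):
    # LSTM (Hochreiter and Schmidhuber 1997): input, forget, and output
    # gates plus an additive cell update, which gives gradients a more
    # direct path through time than the vanilla recurrence above.
    z = W @ np.concatenate([x_t, h_prev]) + b
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell value
    c_t = f * c_prev + i * g   # additive memory update
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Toy usage: run both cells over a short random sequence.
rng = np.random.default_rng(0)
D, H, T = 4, 3, 5  # input size, hidden size, sequence length
xs = [rng.normal(size=D) for _ in range(T)]

W_xh, W_hh, b_h = rng.normal(size=(H, D)), rng.normal(size=(H, H)), np.zeros(H)
W_lstm, b_lstm = rng.normal(size=(4 * H, D + H)), np.zeros(4 * H)

h = np.zeros(H)
for x in xs:
    h = rnn_step(x, h, W_xh, W_hh, b_h)

h_l, c_l = np.zeros(H), np.zeros(H)
for x in xs:
    h_l, c_l = lstm_step(x, h_l, c_l, W_lstm, b_lstm)

print("vanilla RNN final state:", h)
print("LSTM final state:", h_l)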

Syllabus

Intro
NLP and Sequential Data
Long-distance Dependencies in Language
Parameter Tying
What Can RNNs Do?
e.g. Language Modeling
Representing Sentences
Representing Contexts
Recurrent Neural Networks in DyNet
Parameter Initialization
Sentence Initialization
A Solution: Long Short-term Memory (Hochreiter and Schmidhuber 1997)
Other Alternatives
Handling Mini-batching
Mini-batching Method
Handling Long Sequences
Example: LM - Sentence Classifier
LSTM Structure


Taught by

Graham Neubig

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
機器學習技法 (Machine Learning Techniques)
National Taiwan University via Coursera
Machine Learning Capstone: An Intelligent Application with Deep Learning
University of Washington via Coursera
Прикладные задачи анализа данных (Applied Problems in Data Analysis)
Moscow Institute of Physics and Technology via Coursera
Leading Ambitious Teaching and Learning
Microsoft via edX