Structure-Sensitive Dependency Learning in Recurrent Neural Networks - 2017

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

Long short-term memory (LSTM) Courses
Cognitive Sciences Courses

Course Description

Overview

Explore the capabilities of recurrent neural networks (RNNs) in learning structure-sensitive dependencies from natural language corpora, focusing on English subject-verb number agreement. Delve into Tal Linzen's research examining LSTMs' ability to predict verb number in various sentence types, analyzing their internal representations and comparing their error patterns to the agreement attraction errors humans make. Discover how the networks approximate syntactic structure in common sentence types but struggle with complex constructions, highlighting the need for stronger inductive biases. Learn about the potential of multi-task learning to address these limitations, and gain insights into using linguistic and psycholinguistic methods to evaluate "black-box" neural network models. This hour-long lecture, delivered by Assistant Professor Tal Linzen of Johns Hopkins University, offers valuable perspectives on the intersection of cognitive science, linguistics, and artificial intelligence in natural language processing.
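To make the lecture's central task concrete: the network reads the words preceding a verb and predicts whether that verb should be singular or plural. Below is a minimal sketch of such a number-prediction classifier, assuming PyTorch; the toy vocabulary, model sizes, and example sentence are illustrative assumptions, not the code used in the research.

    # Minimal sketch of the number-prediction task (illustrative, not the
    # lecture's actual code): an LSTM reads the words before a verb and
    # classifies the verb's number as singular or plural.
    import torch
    import torch.nn as nn

    # Toy vocabulary; a real setup would index a large corpus vocabulary.
    vocab = {"<pad>": 0, "the": 1, "key": 2, "keys": 3,
             "to": 4, "cabinet": 5, "cabinets": 6}

    class NumberPredictor(nn.Module):
        def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, 2)  # 0 = singular, 1 = plural

        def forward(self, token_ids):
            embedded = self.embed(token_ids)
            _, (h_n, _) = self.lstm(embedded)  # final hidden state summarizes the prefix
            return self.out(h_n[-1])           # logits over {singular, plural}

    model = NumberPredictor(len(vocab))

    # "The keys to the cabinet ___": plural subject ("keys"), singular
    # attractor ("cabinet"). Agreement attraction errors of the kind the
    # lecture discusses occur when the intervening noun misleads the
    # predictor (or a human) toward a singular verb.
    prefix = torch.tensor([[1, 3, 4, 1, 5]])  # the keys to the cabinet
    logits = model(prefix)
    print(logits.softmax(-1))  # untrained: roughly uniform over the two classes

Training such a model on prefixes harvested from a corpus, then probing it on sentences with intervening nouns of mismatched number, is the kind of evaluation the description above refers to.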

Syllabus

Structure-Sensitive Dependency Learning in Recurrent Neural Networks -- Tal Linzen (JHU) - 2017


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

Reinforcement Learning for Trading Strategies
New York Institute of Finance via Coursera
Natural Language Processing with Sequence Models
DeepLearning.AI via Coursera
Fake News Detection with Machine Learning
Coursera Project Network via Coursera
English/French Translator: Long Short Term Memory Networks
Coursera Project Network via Coursera
Text Classification Using Word2Vec and LSTM on Keras
Coursera Project Network via Coursera