Structure-Sensitive Dependency Learning in Recurrent Neural Networks - 2017

Offered By: Center for Language & Speech Processing (CLSP), JHU via YouTube

Tags

Long Short-Term Memory (LSTM) Courses
Cognitive Sciences Courses

Course Description

Overview

Explore the capabilities of recurrent neural networks (RNNs) in learning structure-sensitive dependencies from natural language corpora, focusing on English subject-verb number agreement. Delve into Tal Linzen's research examining LSTMs' ability to predict verb number in various sentence types, analyzing their internal representations and comparing their performance to human agreement attraction errors. Discover how the networks approximate syntactic structure in common sentences but struggle with complex constructions, highlighting the need for stronger inductive biases. Learn about the potential of multi-task learning to address these limitations and gain insights into using linguistic and psycholinguistic methods to evaluate "black-box" neural network models. This hour-long lecture, delivered by Assistant Professor Tal Linzen from Johns Hopkins University, offers valuable perspectives on the intersection of cognitive science, linguistics, and artificial intelligence in natural language processing.
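The prediction task described above can be sketched in a few lines: a model sees a sentence only up to its verb and must predict whether that verb is singular or plural. The snippet below is an illustrative framing of that setup, not Linzen's actual code; the function and variable names are hypothetical.

```python
# Hypothetical sketch of the subject-verb number-agreement prediction task
# discussed in the lecture (names are illustrative, not from the paper's code).

def make_example(tokens, verb_index, verb_number):
    """Build a (prefix, label) pair: the model is given the words before
    the verb and must predict the verb's grammatical number."""
    prefix = tokens[:verb_index]
    return prefix, verb_number

# "The keys to the cabinet are ..." is a classic agreement-attraction case:
# the intervening singular noun "cabinet" can lure a model (or a human
# speaker) toward the incorrect singular form "is".
tokens = "The keys to the cabinet are on the table".split()
prefix, label = make_example(tokens, verb_index=5, verb_number="plural")
```

A model trained on many such pairs must track the head noun ("keys") across intervening material, which is why this task probes structure sensitivity.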

Syllabus

Structure-Sensitive Dependency Learning in Recurrent Neural Networks -- Tal Linzen (JHU) - 2017


Taught by

Center for Language & Speech Processing (CLSP), JHU

Related Courses

The Brain-Targeted Teaching® Model for 21st Century Schools
Johns Hopkins University via Coursera
Chinese Thought: Ancient Wisdom Meets Modern Science
The University of British Columbia via edX
Language and society
Indian Institute of Technology Madras via Swayam
Minds and Machines
Massachusetts Institute of Technology via edX
人とロボットが共生する未来社会 (A Future Society Where Humans and Robots Coexist) (ga018)
Osaka University via gacco