Computational Benefits and Limitations of Transformers and State-Space Models

Offered By: Simons Institute via YouTube

Tags

Transformers, Long Short-Term Memory (LSTM), Computational Models, Language Models, Sequence Modeling

Course Description

Overview

Explore the computational advantages and constraints of Transformers and state-space models in this 51-minute lecture by Eran Malach of the Kempner Institute at Harvard University. Delve into the mechanisms that enable retrieval, copying, and length generalization in language models, and examine how architectural choices affect performance on these fundamental tasks. Discover theoretical and empirical evidence that Transformers outperform LSTMs and state-space models such as Mamba on copying and retrieval tasks, and learn how Transformers' ability to copy long sequences can be harnessed for length generalization across a range of algorithmic and arithmetic tasks, offering insight into the capabilities and limitations of different sequence-modeling architectures.
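To give a feel for the copying mechanism at stake, here is a minimal Python sketch (an illustration of the standard "induction head" idea, not code from the lecture; the function name and task setup are hypothetical): to predict the next token, attention can look back for the most recent earlier occurrence of the current token and emit the token that followed it, so copying works regardless of sequence length.

```python
# Minimal sketch (not from the lecture): a hard-attention caricature of an
# "induction head", the mechanism attention is thought to use for copying.
import numpy as np

def induction_copy(tokens):
    """preds[t-1] is the prediction for tokens[t] given only tokens[0..t-1]."""
    preds = []
    for t in range(1, len(tokens)):
        current = tokens[t - 1]            # most recent token seen
        pred = None
        for s in range(t - 2, -1, -1):     # scan the prefix backwards
            if tokens[s] == current:       # earlier occurrence of `current`
                pred = tokens[s + 1]       # copy its successor
                break
        preds.append(pred)
    return preds

# "Repeat after me"-style copy task: the sequence appears twice, and the
# second half must be reproduced from the first.
rng = np.random.default_rng(0)
seq = list(rng.choice(1000, size=20, replace=False))  # distinct tokens
doubled = seq + seq
preds = induction_copy(doubled)
n = len(seq)
# Every token of the second half (after its first token) is recovered:
print(all(preds[t - 1] == doubled[t] for t in range(n + 1, 2 * n)))  # True
```

An LSTM or state-space model, whose memory is a fixed-size vector, cannot perform this arbitrary-length lookup once the sequence outgrows its state, which is the intuition behind the separation between architectures discussed in the lecture.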

Syllabus

Computational Benefits and Limitations of Transformers and State-Space Models


Taught by

Simons Institute

Related Courses

Reinforcement Learning for Trading Strategies
New York Institute of Finance via Coursera
Natural Language Processing with Sequence Models
DeepLearning.AI via Coursera
Fake News Detection with Machine Learning
Coursera Project Network via Coursera
English/French Translator: Long Short Term Memory Networks
Coursera Project Network via Coursera
Text Classification Using Word2Vec and LSTM on Keras
Coursera Project Network via Coursera