Computational Benefits and Limitations of Transformers and State-Space Models

Offered By: Simons Institute via YouTube

Tags

Transformers Courses
Long short-term memory (LSTM) Courses
Computational Models Courses
Language Models Courses
Sequence Modeling Courses

Course Description

Overview

Explore the computational advantages and constraints of Transformers and state-space models in this 51-minute lecture by Eran Malach of the Kempner Institute at Harvard University. Delve into the mechanisms enabling retrieval, copying, and length generalization in language models, and examine how architectural choices affect performance on these fundamental tasks. Discover theoretical and empirical evidence that Transformers outperform LSTMs and state-space models such as Mamba on copying and retrieval tasks. Learn how Transformers' ability to copy long sequences can be harnessed for length generalization across a range of algorithmic and arithmetic tasks, offering insight into the capabilities and limitations of different sequence modeling architectures.
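
The copying task referenced above is typically studied on synthetic data. As a purely illustrative aid (not material from the lecture), the minimal Python sketch below shows how such a copy benchmark is often constructed: the model receives a random token string followed by a separator and must reproduce the string exactly, and evaluating on strings longer than those seen during training probes length generalization. The vocabulary, separator, and lengths here are assumptions chosen for illustration.

    import random

    VOCAB = "abcdefghijklmnopqrstuvwxyz"  # illustrative token vocabulary
    SEP = "|"                             # separator between input and its copy

    def make_copy_example(length, rng):
        # Prompt is a random string followed by a separator; the target is
        # the same string, which the model must emit token by token.
        s = "".join(rng.choice(VOCAB) for _ in range(length))
        return s + SEP, s

    def exact_match(prediction, target):
        # The copy task is scored strictly: a single wrong token means failure.
        return prediction == target

    rng = random.Random(0)
    train_prompt, train_target = make_copy_example(20, rng)   # in-distribution length
    eval_prompt, eval_target = make_copy_example(100, rng)    # longer string, probes length generalization
    print(train_prompt, "->", train_target)
    print(eval_prompt[:30] + "...", "->", eval_target[:30] + "...")

Under this setup, an attention-based model can in principle solve the task by retrieving each input token directly, whereas a fixed-size recurrent state must compress the whole string, which is one intuition behind the gap the lecture examines.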

Syllabus

Computational Benefits and Limitations of Transformers and State-Space Models


Taught by

Simons Institute

Related Courses

Linear Circuits
Georgia Institute of Technology via Coursera
Introduction to Energy and Power Engineering
King Abdulaziz University via Rwaq (رواق)
Magnetic Materials and Devices
Massachusetts Institute of Technology via edX
Linear Circuits 2: AC Analysis
Georgia Institute of Technology via Coursera
Electric Power Transmission
Tecnológico de Monterrey via edX