On the Tradeoffs of State Space Models vs. Transformers
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the fundamental tradeoffs between State Space Models (SSMs) and Transformers in this 49-minute lecture by Albert Gu of Carnegie Mellon University. Gain a high-level overview of SSMs, a recently popular subquadratic alternative to Transformers for sequence modeling. Examine the characteristics of these models and their implications for machine learning and artificial intelligence. Learn how SSMs can offer advantages in computational efficiency, and weigh those gains against their trade-offs relative to the widely used Transformer architecture. Discover insights that can inform model selection and implementation across a range of AI applications.
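As a rough illustration of the efficiency contrast the lecture discusses, the sketch below (not taken from the talk; all shapes and matrices are illustrative) compares an SSM-style linear recurrence, which carries a fixed-size state across the sequence, with causal self-attention, which compares every pair of positions.

```python
import numpy as np

# Minimal sketch (illustrative only): an SSM processes a length-L sequence with
# a fixed-size recurrent state (O(L) time, constant memory per step), while
# self-attention builds an (L, L) score matrix (O(L^2) time and memory).

L, d, n = 1024, 64, 16          # sequence length, model width, state size (arbitrary)
x = np.random.randn(L, d)

# --- SSM-style linear recurrence: h_t = A h_{t-1} + B x_t,  y_t = C h_t ---
A = np.eye(n) * 0.9             # toy stable state matrix
B = np.random.randn(n, d) / np.sqrt(d)
C = np.random.randn(d, n) / np.sqrt(n)
h = np.zeros(n)
y_ssm = np.empty((L, d))
for t in range(L):              # single pass, state size independent of L
    h = A @ h + B @ x[t]
    y_ssm[t] = C @ h

# --- Attention-style mixing: each token attends to all previous tokens ---
scores = x @ x.T / np.sqrt(d)                    # (L, L) pairwise scores
mask = np.tril(np.ones((L, L), dtype=bool))      # causal mask
scores = np.where(mask, scores, -np.inf)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
y_attn = weights @ x                             # quadratic in sequence length
```

The recurrence compresses the entire history into a small state vector, which is the source of the efficiency gains and of the tradeoffs the lecture examines.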
Syllabus
On the Tradeoffs of State Space Models
Taught by
Simons Institute
Related Courses
Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX