On the Tradeoffs of State Space Models vs. Transformers
Offered By: Simons Institute via YouTube
Course Description
Overview
Explore the fundamental tradeoffs between State Space Models (SSMs) and Transformers in this 49-minute lecture by Albert Gu from Carnegie Mellon University. Gain a high-level overview of SSMs, a recently popular subquadratic alternative to Transformers in sequence modeling. Delve into the characteristics of these models and understand their implications for machine learning and artificial intelligence. Learn how SSMs offer potential advantages in computational efficiency, and consider their trade-offs against the widely used Transformer architecture. Discover insights that can inform decisions about model selection and implementation in various AI applications.
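The efficiency contrast mentioned above comes from how the two architectures process a sequence: an SSM carries a fixed-size state forward one step at a time (linear in sequence length), while self-attention compares every pair of positions (quadratic). A minimal sketch of a linear SSM recurrence, with toy parameters chosen purely for illustration (not from the lecture):

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a minimal linear state space model over an input sequence u.

    State update:  x[t] = A @ x[t-1] + B * u[t]
    Readout:       y[t] = C @ x[t]

    Cost is O(L * n^2) for sequence length L and state size n --
    linear in L, unlike self-attention's O(L^2) pairwise scores.
    """
    n = A.shape[0]
    x = np.zeros(n)
    ys = []
    for u_t in u:              # one fixed-cost state update per step
        x = A @ x + B * u_t
        ys.append(C @ x)
    return np.array(ys)

# Hypothetical toy parameters, illustrative only.
n = 4
rng = np.random.default_rng(0)
A = 0.9 * np.eye(n)            # stable (contracting) dynamics
B = rng.standard_normal(n)
C = rng.standard_normal(n)
u = rng.standard_normal(64)    # length-64 input sequence

y = ssm_scan(A, B, C, u)
print(y.shape)  # one output per input step: (64,)
```

Because the state has fixed size, memory at inference time is constant in sequence length; the flip side, a core theme of the tradeoff discussion, is that everything the model remembers must be compressed into that state.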
Syllabus
On the Tradeoffs of State Space Models
Taught by
Simons Institute
Related Courses
Applied Deep Learning: Build a Chatbot - Theory, Application
Udemy
Can Wikipedia Help Offline Reinforcement Learning? - Paper Explained
Yannic Kilcher via YouTube
Infinite Memory Transformer - Research Paper Explained
Yannic Kilcher via YouTube
Recurrent Neural Networks and Transformers
Alexander Amini via YouTube
MIT 6.S191 - Recurrent Neural Networks
Alexander Amini via YouTube