A Unifying Framework for Memory and Abstraction
Offered By: Artem Kirsanov via YouTube
Course Description
Overview
Explore the Tolman-Eichenbaum Machine, a computational model unifying memory and spatial navigation in the hippocampal formation. Delve into the model's architecture, including position and memory modules, and understand its step-by-step operation. Examine the model's performance, cellular representations, and its ability to predict remapping laws. Learn how this framework relates to Transformer networks and gain insights into cognitive map building. Discover the connections between computational neuroscience, artificial intelligence, and our understanding of memory and spatial navigation in this informative 24-minute video lecture.
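The description above splits TEM into a position module and a memory module; the toy Python sketch below is meant only to make that split concrete, not to reproduce the model from the lecture. The dimensions, the hand-built orthogonal transition matrices, and the Hebbian storage rule are illustrative placeholders for components that TEM actually learns, and the retrieval step (querying stored location-observation bindings with the current location code) is the operation the lecture later relates to attention in Transformer networks.

```python
# Minimal, illustrative sketch of a TEM-style factorization (not the model's code):
# a position module path-integrates an abstract location code g from actions alone,
# and a memory module binds g to sensory observations x with Hebbian outer products,
# so that returning to a location lets the observation be predicted before it is seen.
import numpy as np

rng = np.random.default_rng(0)
N_G, N_X = 32, 16  # sizes of the location code g and sensory code x (arbitrary choices)

# Position module: one transition matrix per action. Here opposite actions are
# exact inverses of each other (orthogonal matrices); TEM learns its transitions.
Q, _ = np.linalg.qr(rng.normal(size=(N_G, N_G)))
R, _ = np.linalg.qr(rng.normal(size=(N_G, N_G)))
W = {"north": Q, "south": Q.T, "east": R, "west": R.T}

def step_position(g, action):
    """Path integration: update the location code using only the chosen action."""
    return W[action] @ g

def store(M, g, x):
    """Memory module: bind the current observation to the current location code."""
    return M + np.outer(x, g)

def predict(M, g):
    """Retrieve the observation previously bound to this location code."""
    return M @ g

# Toy walk: observe something here, wander away, path-integrate back, and check
# that the memory recalls the observation from the location code alone.
M = np.zeros((N_X, N_G))
g0 = rng.normal(size=N_G); g0 /= np.linalg.norm(g0)
x0 = rng.normal(size=N_X)
M = store(M, g0, x0)

g = step_position(step_position(g0, "north"), "south")  # move out and back
recalled = predict(M, g)
cos = recalled @ x0 / (np.linalg.norm(recalled) * np.linalg.norm(x0))
print(f"cosine(recalled, stored) = {cos:.3f}")  # ~1.0: correct prediction on return
```

In this toy, predicting an observation means querying a store of key-value pairs (location codes as keys, observations as values) with the current location code, which is the sense in which the lecture casts TEM's memory retrieval as a form of attention.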
Syllabus
- Introduction
- Motivation: Agents, Rewards and Actions
- Prediction Problem
- Model architecture
- Position module
- Memory module
- Running TEM step-by-step
- Model performance
- Cellular representations
- TEM predicts remapping laws
- Recap and Acknowledgments
- TEM as a Transformer network
- Brilliant
- Outro
Taught by
Artem Kirsanov
Related Courses
- Introduction to Artificial Intelligence (Stanford University via Udacity)
- Probabilistic Graphical Models 1: Representation (Stanford University via Coursera)
- Artificial Intelligence for Robotics (Stanford University via Udacity)
- Computer Vision: The Fundamentals (University of California, Berkeley via Coursera)
- Learning from Data (Introductory Machine Learning course) (California Institute of Technology via Independent)