Computational Principles Underlying the Learning of Sensorimotor Repertoires
Offered By: Massachusetts Institute of Technology via YouTube
Course Description
Overview
Explore the computational principles underlying sensorimotor learning in this MIT Embodied Intelligence lecture by Daniel Wolpert. Delve into how continuous sensorimotor experience is segmented into separate memories and how a growing motor repertoire is adapted. Examine the role of context in activating motor memories and how statistical learning leads to multimodal object representations. Discover a principled theory of motor learning based on contextual inference, which challenges dominant single-context theories of learning. Learn how this model explains key features of motor learning and predicts novel phenomena that have been confirmed experimentally. Investigate topics such as follow-through, tool use, control points, spontaneous recovery, memory updating, single-trial learning, savings, and explicit-implicit learning. Gain insights from Wolpert's extensive research background in neuroscience, engineering, and physiology, and understand the implications of contextual inference as a fundamental principle of motor behavior.
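The contextual-inference idea described above can be illustrated with a small computational sketch. The code below is a minimal, hypothetical illustration under simplifying assumptions (a fixed set of contexts, a Gaussian likelihood for prediction errors, and a "sticky" context prior); it is not the lecture's or the published model's actual implementation, and all class, parameter, and variable names are assumptions made for this example. Each context keeps its own motor memory, a posterior responsibility over contexts is inferred from the prediction error, motor output blends the memories by responsibility, and learning is apportioned the same way.

```python
import numpy as np

# Minimal sketch of contextual-inference motor learning (illustrative only,
# not the authors' implementation). Each context holds its own motor memory,
# here a scalar estimate of a perturbation. On every trial the learner:
#   1. infers the responsibility (posterior probability) of each context
#      from the observed prediction error,
#   2. expresses a responsibility-weighted motor output,
#   3. updates every memory in proportion to its responsibility.

class ContextualLearner:
    def __init__(self, n_contexts=2, obs_noise=0.5, learning_rate=0.2):
        self.memories = np.zeros(n_contexts)          # one motor memory per context
        self.prior = np.full(n_contexts, 1.0 / n_contexts)
        self.obs_noise = obs_noise                    # assumed sensory noise (std)
        self.lr = learning_rate

    def motor_output(self):
        # Output blends memories according to current context beliefs.
        return self.prior @ self.memories

    def update(self, observed_perturbation):
        # Likelihood of the observation under each context's memory.
        err = observed_perturbation - self.memories
        lik = np.exp(-0.5 * (err / self.obs_noise) ** 2)
        post = self.prior * lik
        post /= post.sum()
        # Learning is apportioned by contextual responsibility.
        self.memories += self.lr * post * err
        # Sticky context beliefs: carry most of the posterior forward.
        self.prior = 0.9 * post + 0.1 / len(self.memories)


# Example schedule: adapt to a +1 perturbation, switch to -1, then back.
learner = ContextualLearner()
for p in [1.0] * 30 + [-1.0] * 10 + [1.0] * 5:
    learner.update(p)
    print(round(learner.motor_output(), 3))
```

In a sketch like this, an abrupt switch in the environment shifts responsibility toward a different memory rather than overwriting the first one, which is the intuition behind effects such as spontaneous recovery listed in the syllabus below.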
Syllabus
Intro
Overview
Question
Rules for motor memories
Follow-through
Follow-through replication
Tool use
Learning tools
Control points
Generative model
Model details
Toy problem
Spontaneous recovery
Memory updating
Single-trial learning
Savings
Explicit-implicit learning
Conclusion
Taught by
MIT Embodied Intelligence