A Roadmap for Reverse-Architecting the Brain’s Neocortex
Offered By: Association for Computing Machinery (ACM) via YouTube
Course Description
Overview
Explore a comprehensive roadmap for reverse-architecting the brain's neocortex in this thought-provoking conference talk by James E. Smith from the University of Wisconsin-Madison. Delve into the unconventional approach of reconstructing the computational paradigms used in the neocortex, starting with the end product and working backwards. Learn about the meta-architecture framework, natural layers of abstraction, and key milestones in developing biologically plausible neural networks capable of unsupervised, continual learning. Discover the challenges and opportunities in moving from biological electronics to computational primitives and functional building blocks. Gain insights into the potential for innovation in this wide-open research space and understand how reverse-architecting higher-level cognitive functions could shape the future of computer architecture research for decades to come.
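The syllabus below touches on spikes, rate coding, timing-dependent plasticity, and temporal networks. As a rough intuition for that style of computation, here is a minimal, self-contained sketch of a leaky spiking neuron whose weights are adjusted by a simplified spike-timing rule. All parameter values and the learning rule itself are illustrative assumptions for this listing, not the model presented in the talk.

```python
import numpy as np

# Toy spiking neuron with a simplified spike-timing-dependent plasticity
# (STDP) rule -- an illustrative assumption, not the speaker's actual model.
rng = np.random.default_rng(0)

N_INPUTS = 16            # number of presynaptic input lines (assumed)
THRESHOLD = 4.0          # membrane potential needed to fire (assumed)
TAU = 10.0               # membrane leak time constant, in time steps (assumed)
A_PLUS, A_MINUS = 0.05, 0.02   # potentiation / depression step sizes (assumed)

weights = rng.uniform(0.2, 0.8, N_INPUTS)
potential = 0.0
last_input_spike = np.full(N_INPUTS, -np.inf)  # most recent spike time per input

def step(t, input_spikes):
    """Advance one time step given a boolean vector of input spikes."""
    global potential
    # Leaky integration of weighted input spikes.
    potential = potential * np.exp(-1.0 / TAU) + weights @ input_spikes
    last_input_spike[input_spikes] = t
    if potential >= THRESHOLD:
        potential = 0.0
        # Strengthen inputs that spiked shortly before the output spike,
        # weaken the rest (a crude stand-in for timing-dependent plasticity).
        recent = (t - last_input_spike) < TAU
        weights[recent] = np.minimum(1.0, weights[recent] + A_PLUS)
        weights[~recent] = np.maximum(0.0, weights[~recent] - A_MINUS)
        return True
    return False

# Drive the neuron with random input spikes for a few hundred steps.
for t in range(300):
    step(t, rng.random(N_INPUTS) < 0.1)
print("learned weights:", np.round(weights, 2))
```

The point of the sketch is only that learning here depends on when spikes arrive relative to one another, rather than on averaged firing rates, which is the distinction the talk draws between rate coding and temporal networks.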
Syllabus
Introduction
Thank you
Roadmap
Temporal Neural Network
Outline
Neocortex
Physical Architecture
Neurons
Excitatory neuron model
Architecture
Long Term Roadmap
Spikes
Flow of Time
Rate Coding
Temporal Network
Neural Network Taxonomy
Inhibition
Timing-dependent plasticity
Decision trees
Simple example
Analog circuit
Pantheon of Neuroscience Architects
Computational Column
Cluster IDs
Waypoints
Outputs
Results
Cookie Cutter Column
Research Space
Neural Networks
Temporal Algebra
Closing Remarks
Are we at a tipping point?
Bibliography
Audience Questions
Taught by
Association for Computing Machinery (ACM)
Related Courses
Neural Networks for Machine Learning (University of Toronto via Coursera)
Good Brain, Bad Brain: Basics (University of Birmingham via FutureLearn)
Statistical Learning with R (Stanford University via edX)
Machine Learning 1—Supervised Learning (Brown University via Udacity)
Fundamentals of Neuroscience, Part 2: Neurons and Networks (Harvard University via edX)