Neuromorphic Engineering Algorithms for Edge ML and Spiking Neural Networks
Offered By: tinyML via YouTube
Course Description
Overview
Syllabus
Intro
JOHNS HOPKINS UNIVERSITY
Spiking for tinyML
Batch Normalization Through Time (BNTT) for Temporal Learning
BNTT: Energy Efficiency & Robustness
Training SNNs for the edge with heterogeneous demands
Spike Activation Map (SAM) for interpretable SNNs
Spiking neurons are binary units with timed outputs
End-to-end training is key for artificial neural networks
Solution: Replace the true gradient with a surrogate gradient
Surrogate gradients self-calibrate neuromorphic systems when they can access the analog substrate variables
Fluctuation-driven initialization and bio-inspired homeostatic plasticity ensure optimal initialization
Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations
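The syllabus items above on spiking neurons and surrogate gradients can be illustrated with a minimal sketch: a leaky integrate-and-fire (LIF) neuron unrolled over discrete time steps, whose non-differentiable spike (Heaviside) is paired with a smooth surrogate derivative for training. This is a generic illustration, not code from the talk; the fast-sigmoid surrogate, the decay constant `tau`, and the threshold `v_th` are assumed values chosen for the example.

```python
import numpy as np

def heaviside(x):
    # True spike function: fires (1.0) when the membrane potential
    # crosses threshold. Its gradient is zero almost everywhere,
    # which blocks end-to-end backpropagation.
    return float(x >= 0.0)

def surrogate_grad(x, beta=10.0):
    # Surrogate derivative (fast-sigmoid family, an assumed choice):
    # used in place of the Heaviside's true gradient during the
    # backward pass so that gradient-based training remains possible.
    return 1.0 / (beta * abs(x) + 1.0) ** 2

def lif_forward(inputs, tau=0.9, v_th=1.0):
    # Leaky integrate-and-fire neuron unrolled over time:
    # the membrane potential leaks by factor tau, integrates the
    # input current, emits a binary spike at threshold, then resets.
    v, spikes = 0.0, []
    for x in inputs:
        v = tau * v + x
        s = heaviside(v - v_th)
        spikes.append(s)
        v = v * (1.0 - s)  # reset membrane on spike
    return spikes

# A constant sub-threshold input still triggers a spike once enough
# charge accumulates, showing the timed, binary nature of the output.
print(lif_forward([0.6, 0.6, 0.6]))
```

In a full SNN training loop the forward pass would use `heaviside` while the backward pass substitutes `surrogate_grad`, which is the core idea behind surrogate-gradient learning referenced in the talk titles above.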
Technical Program Committee
Taught by
tinyML
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent