Neuromorphic Engineering Algorithms for Edge ML and Spiking Neural Networks
Offered By: tinyML via YouTube
Course Description
Overview
Syllabus
Intro
Johns Hopkins University
Spiking for tinyML
Batch Normalization Through Time (BNTT) for Temporal Learning
BNTT: Energy Efficiency & Robustness
Training SNNs for the edge with heterogeneous demands
Spike Activation Map (SAM) for interpretable SNNs
Spiking neurons are binary units with timed outputs
End-to-end training is key for artificial neural networks
Solution: Replace the true gradient with a surrogate gradient
Surrogate gradients self-calibrate neuromorphic systems when they can access the analog substrate variables
Fluctuation-driven initialization and bio-inspired homeostatic plasticity ensure optimal initialization
Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations
Technical Program Committee
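The syllabus item on Batch Normalization Through Time (BNTT) refers to applying batch normalization with time-step-specific statistics and parameters inside the unrolled SNN simulation. Below is a minimal sketch of that idea, assuming PyTorch; the class name BNTT1d and the surrounding loop are illustrative, not the speakers' reference implementation.

    import torch
    import torch.nn as nn

    class BNTT1d(nn.Module):
        # Batch Normalization Through Time: one BatchNorm1d per simulation step,
        # so normalization statistics and learnable scales vary with the time step t.
        def __init__(self, num_features, num_steps):
            super().__init__()
            self.bn = nn.ModuleList(
                [nn.BatchNorm1d(num_features) for _ in range(num_steps)]
            )

        def forward(self, x_t, t):
            # x_t: pre-activation input current at time step t, shape (batch, features)
            return self.bn[t](x_t)

    # Illustrative use inside an SNN simulation loop with a leaky membrane:
    T, B, F = 8, 32, 128
    fc = nn.Linear(F, F)
    bntt = BNTT1d(F, T)
    v = torch.zeros(B, F)
    for t in range(T):
        x_t = torch.rand(B, F)            # stand-in for an input spike tensor at step t
        v = 0.9 * v + bntt(fc(x_t), t)    # time-step-specific normalization of the current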
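Several syllabus items (spiking neurons as binary units with timed outputs; replacing the true gradient with a surrogate gradient) describe how SNNs are trained end to end: the hard spike threshold is kept in the forward pass, while its non-differentiable step is replaced by a smooth surrogate in the backward pass so backpropagation through time can run over the unrolled simulation. A minimal sketch, again assuming PyTorch; SurrogateSpike, lif_step, and the fast-sigmoid surrogate are illustrative choices rather than the exact formulation used in the talk.

    import torch

    class SurrogateSpike(torch.autograd.Function):
        # Forward: hard threshold (binary spike). Backward: fast-sigmoid surrogate derivative.
        @staticmethod
        def forward(ctx, v_minus_thresh):
            ctx.save_for_backward(v_minus_thresh)
            return (v_minus_thresh > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (v,) = ctx.saved_tensors
            # Surrogate derivative 1 / (1 + |v|)^2 stands in for the Dirac delta of the step
            return grad_output / (1.0 + v.abs()) ** 2

    spike_fn = SurrogateSpike.apply

    def lif_step(v, x, w, beta=0.9, thresh=1.0):
        # One leaky integrate-and-fire step: leak, integrate input, spike, soft reset.
        v = beta * v + x @ w
        s = spike_fn(v - thresh)   # binary, timed output (0 or 1 at this step)
        v = v - s * thresh
        return v, s

With a surrogate in place, loss gradients flow through the spike train, and the network can be trained with ordinary optimizers despite the binary activations.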
Taught by
tinyML
Related Courses
UT.1.01x: Energy 101 (The University of Texas at Austin via edX)
Data Analytics in Business (IEEE via edX)
Power Up: English for the Energy Transition (Center for Technology Enhanced Learning via iversity)
Introduction to Sustainable Construction (Universidad de Cantabria via Miríadax)
Eficiencia energética en instalaciones de iluminación (Universitat Jaume I via Independent)