A Bregman Learning Framework for Sparse Neural Networks
Offered By: Society for Industrial and Applied Mathematics via YouTube
Course Description
Overview
Explore a cutting-edge learning framework for sparse neural networks in this talk from the 30th Imaging & Inverse Problems (IMAGINE) OneWorld SIAM-IS virtual seminar series. Delve into Leon Bungert's presentation of a novel approach based on stochastic Bregman iterations, which trains sparse neural networks via an inverse scale space method. Learn about the baseline LinBreg algorithm, its accelerated momentum variant, and AdaBreg, a Bregmanized generalization of the Adam algorithm. Discover a statistically sound sparse parameter initialization strategy, and gain insight into a stochastic convergence analysis of the loss decay along with additional convergence proofs in the convex regime. Understand how this Bregman learning framework can be applied to Neural Architecture Search, potentially uncovering autoencoder architectures for denoising or deblurring tasks.
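To make the core mechanism concrete, here is a minimal, illustrative sketch of a linearized Bregman (LinBreg-style) update for a generic differentiable loss. It is not code from the talk: the elastic-net-type regularizer J(theta) = lam*||theta||_1 + ||theta||^2/(2*delta) and the hyperparameter names (lam, tau, delta) are assumptions chosen for the example. The key point is that the gradient step acts on a dual variable v, while the actual parameters theta are recovered by soft-thresholding, so they start at zero and become nonzero only gradually, in inverse-scale-space fashion.

```python
import numpy as np

def soft_shrink(v, lam):
    """Soft-thresholding: the proximal operator of lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linbreg(grad_fn, dim, lam=0.1, tau=0.01, delta=1.0, n_iters=500):
    """Plain linearized Bregman iteration (illustrative, deterministic).

    grad_fn(theta) returns the gradient of the loss at theta.
    The dual variable v accumulates gradient steps; theta is obtained
    by shrinkage, so it stays exactly sparse, with entries activating
    only once the corresponding |v| exceeds lam.
    """
    v = np.zeros(dim)                        # sparse start: theta begins at 0
    theta = delta * soft_shrink(v, lam)
    for _ in range(n_iters):
        v -= tau * grad_fn(theta)            # gradient step on the dual variable
        theta = delta * soft_shrink(v, lam)  # primal update via soft-thresholding
    return theta

# Toy usage: sparse least squares, loss(theta) = 0.5 * ||A @ theta - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
truth = np.zeros(200)
truth[:5] = 1.0
b = A @ truth
theta = linbreg(lambda t: A.T @ (A @ t - b), dim=200, lam=0.5,
                tau=1.0 / np.linalg.norm(A, 2) ** 2)
print("nonzero entries:", np.count_nonzero(theta))
```

The momentum and AdaBreg variants mentioned above keep this structure but replace the plain gradient step on v with a momentum or Adam-type update.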
Syllabus
30th Imaging & Inverse Problems (IMAGINE) OneWorld SIAM-IS Virtual Seminar Series Talk
Taught by
Society for Industrial and Applied Mathematics
Related Courses
Machine Learning Modeling Pipelines in Production
DeepLearning.AI via Coursera
MLOps for Scaling TinyML
Harvard University via edX
AutoML for Natural Language Processing - EACL 2013 Tutorial
Center for Language & Speech Processing (CLSP), JHU via YouTube
AutoML Towards Deep Learning: Optimizing Neural Architectures and Hyperparameters
Toronto Machine Learning Series (TMLS) via YouTube
Deep Learning for Mobile Devices
WeAreDevelopers via YouTube