Automatic Differentiation for Sparse Tensors
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore a cutting-edge framework for efficient automatic differentiation of sparse tensors in this 15-minute conference talk presented at ACM SIGPLAN's CTSTA'23. Delve into the challenges posed by irregular sparsity patterns in data-intensive applications and discover how this novel approach overcomes substantial memory and computational overheads. Learn about the key aspects of the proposed framework, including a compilation pipeline that leverages two intermediate DSLs with AD-agnostic domain-specific optimizations and efficient C++ code generation. Gain insights into how this innovative solution outperforms state-of-the-art alternatives across various synthetic and real-world sparse tensor datasets, potentially revolutionizing the field of automatic differentiation for sparse tensor operations.
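To make the underlying problem concrete, below is a minimal hand-written sketch (not the framework presented in the talk) of reverse-mode differentiation through a sparse matrix-vector product stored in CSR format. The key point it illustrates is that the gradient with respect to the stored nonzeros can reuse the matrix's sparsity pattern, rather than materializing a dense gradient; exploiting this kind of structure automatically is what a sparse-aware AD compiler aims for.

```cpp
#include <cstdio>
#include <vector>

// Minimal CSR sparse matrix: only nonzero values are stored, so the
// gradient with respect to A can share the same sparsity pattern.
struct CSR {
    int rows;
    std::vector<int> row_ptr;   // size rows + 1
    std::vector<int> col_idx;   // size nnz
    std::vector<double> vals;   // size nnz
};

// Forward pass: y = A * x
std::vector<double> spmv(const CSR& A, const std::vector<double>& x) {
    std::vector<double> y(A.rows, 0.0);
    for (int i = 0; i < A.rows; ++i)
        for (int k = A.row_ptr[i]; k < A.row_ptr[i + 1]; ++k)
            y[i] += A.vals[k] * x[A.col_idx[k]];
    return y;
}

// Reverse pass: given dL/dy, compute dL/dvals without densifying A.
// dL/dA[i][j] = dL/dy[i] * x[j], but only for stored (i, j) pairs.
std::vector<double> spmv_grad_vals(const CSR& A,
                                   const std::vector<double>& x,
                                   const std::vector<double>& dy) {
    std::vector<double> dvals(A.vals.size(), 0.0);
    for (int i = 0; i < A.rows; ++i)
        for (int k = A.row_ptr[i]; k < A.row_ptr[i + 1]; ++k)
            dvals[k] = dy[i] * x[A.col_idx[k]];
    return dvals;
}

int main() {
    // 2x3 matrix with 3 nonzeros: A = [[1, 0, 2], [0, 3, 0]]
    CSR A{2, {0, 2, 3}, {0, 2, 1}, {1.0, 2.0, 3.0}};
    std::vector<double> x{4.0, 5.0, 6.0};

    std::vector<double> y = spmv(A, x);           // y = [16, 15]
    std::vector<double> dy{1.0, 1.0};             // L = sum(y), so dL/dy = [1, 1]
    std::vector<double> dvals = spmv_grad_vals(A, x, dy);

    for (double g : dvals) std::printf("%g ", g); // prints: 4 6 5
    std::printf("\n");
    return 0;
}
```

The irregular indirection through row_ptr and col_idx in this sketch is exactly the kind of access pattern that makes naive differentiation of sparse code memory- and compute-heavy, which is the overhead the proposed compilation pipeline targets.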
Syllabus
[CTSTA'23] Automatic Differentiation for Sparse Tensors
Taught by
ACM SIGPLAN
Related Courses
Introduction to Neural Networks and PyTorch (IBM via Coursera)
Regression with Automatic Differentiation in TensorFlow (Coursera Project Network via Coursera)
Neural Network from Scratch in TensorFlow (Coursera Project Network via Coursera)
Customising your models with TensorFlow 2 (Imperial College London via Coursera)
PyTorch Fundamentals (Microsoft via Microsoft Learn)