Efficient CHAD - Optimizing Combinatory Homomorphic Automatic Differentiation
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore an optimized approach to Combinatory Homomorphic Automatic Differentiation (CHAD) in this 19-minute conference talk from POPL 2024. Discover how researchers from Utrecht University enhanced the basic CHAD algorithm, using well-known methods, into a simple, composable, and widely applicable reverse-mode automatic differentiation technique. Learn how sparse vectors, state-passing-style code, and functional mutable updates are used to achieve the computational complexity expected of reverse-mode AD. Examine the Agda formalization of the complexity proof and understand how these techniques can be applied to differentiate parallel functional array programs. Gain insights into preserving task-parallelism and writing data-parallel derivatives for array primitives. Access the accompanying article and supplementary archive for a deeper dive into this research on efficient automatic differentiation in functional programming.
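As a rough illustration of the idea behind CHAD (this is a minimal sketch, not the authors' implementation; the names D, dComp, dSquare, and dFourth are invented for this example), a reverse-mode transformation turns a function from a to b into one that returns the primal result of type b together with a backpropagator mapping an output cotangent back to an input cotangent. Composing such transformed functions runs the backpropagators in reverse order, which is what makes the technique reverse mode:

```haskell
-- Sketch of the CHAD pairing: a function a -> b becomes a function
-- returning the primal result and a backpropagator db -> da.
-- (Illustrative only; the talk's optimizations replace this naive
-- cotangent handling with sparse vectors and state passing.)
type D a da b db = a -> (b, db -> da)

-- Differentiated composition: primals flow forward, while the
-- backpropagators compose in reverse order (f' . g').
dComp :: D b db c dc -> D a da b db -> D a da c dc
dComp g f = \x ->
  let (y, f') = f x
      (z, g') = g y
  in (z, f' . g')

-- An example primitive: squaring, whose derivative is 2*x.
dSquare :: D Double Double Double Double
dSquare x = (x * x, \dz -> 2 * x * dz)

-- (x^2)^2 = x^4, so the backpropagator computes 4*x^3.
dFourth :: D Double Double Double Double
dFourth = dComp dSquare dSquare

main :: IO ()
main = do
  let (y, bp) = dFourth 3.0
  print y        -- 81.0
  print (bp 1.0) -- 108.0 (= 4 * 3^3)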
Syllabus
[POPL'24] Efficient CHAD
Taught by
ACM SIGPLAN
Related Courses
Introduction to Neural Networks and PyTorch (IBM via Coursera)
Regression with Automatic Differentiation in TensorFlow (Coursera Project Network via Coursera)
Neural Network from Scratch in TensorFlow (Coursera Project Network via Coursera)
Customising your models with TensorFlow 2 (Imperial College London via Coursera)
PyTorch Fundamentals (Microsoft via Microsoft Learn)