BLASPhemy - Improving BLAS Handling in Enzyme.jl

Offered By: The Julia Programming Language via YouTube

Tags

Julia, Code Generation, Multithreading, Automatic Differentiation, LLVM

Course Description

Overview

Explore the challenges and improvements in differentiating BLAS and LAPACK routines within the Julia ecosystem. Learn about the limitations of Enzyme.jl in handling black-box BLAS implementations and the initial workaround of falling back to generic OpenBLAS implementations. Discover the approach of using LLVM's code-generation capabilities to generate efficient differentiation rules for low-level BLAS calls. Understand how these improvements significantly enhance BLAS AD performance, prevent crashes with large matrices, and enable support for hardware-specific, multithreaded BLAS libraries. Gain insights into ongoing work on further performance optimization through memory management techniques and its impact on downstream Julia applications.
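To make the topic concrete, here is a minimal sketch of differentiating through a BLAS-backed routine with Enzyme.jl. The function `f` below is a hypothetical example (not from the talk); `dot` lowers to a BLAS Level-1 call, which Enzyme can now differentiate via its generated low-level BLAS rules rather than a generic fallback.

```julia
using Enzyme, LinearAlgebra

# Hypothetical example: `dot` dispatches to a BLAS routine under the hood.
f(x, y) = dot(x, y)

x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
dx = zero(x)   # gradient accumulator for x
dy = zero(y)   # gradient accumulator for y

# Reverse-mode AD: `Active` marks the scalar return as differentiated,
# `Duplicated` pairs each input with its shadow (gradient) buffer.
Enzyme.autodiff(Reverse, f, Active,
                Duplicated(x, dx), Duplicated(y, dy))

# For f(x, y) = x ⋅ y we expect ∂f/∂x = y and ∂f/∂y = x.
@assert dx ≈ y
@assert dy ≈ x
```

Because Enzyme generates rules for the low-level BLAS call itself, the same pattern extends to multithreaded, hardware-specific BLAS libraries rather than requiring a slow generic fallback.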

Syllabus

BLASPhemy | Sebastian Drehwald | JuliaCon 2024


Taught by

The Julia Programming Language

Related Courses

Introduction to Neural Networks and PyTorch
IBM via Coursera
Regression with Automatic Differentiation in TensorFlow
Coursera Project Network via Coursera
Neural Network from Scratch in TensorFlow
Coursera Project Network via Coursera
Customising your models with TensorFlow 2
Imperial College London via Coursera
PyTorch Fundamentals
Microsoft via Microsoft Learn