PET - Optimizing Tensor Programs with Partially Equivalent Transformations and Automated Corrections
Offered By: USENIX via YouTube
Course Description
Overview
Explore a groundbreaking approach to optimizing tensor programs in this 15-minute conference talk from OSDI '21. Dive into PET (Partially Equivalent Transformations), a novel DNN framework that optimizes programs by applying transformations that preserve only partial functional equivalence, then automatically correcting the results to restore full equivalence, unlocking optimization opportunities that fully equivalent approaches miss. Discover the rigorous theoretical foundations that simplify equivalence examination and correction, and understand the efficient search algorithm that combines fully and partially equivalent optimizations at multiple levels. Gain insights into PET's performance compared to existing systems, with speedups of up to 2.5x. Examine key challenges, the mutant generator, multi-linear tensor programs (MLTPs), and the program optimizer in this presentation on enhancing deep neural network efficiency.
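The core idea of "transform, then correct" can be illustrated with a toy example. The sketch below is not from the talk or paper; it is a minimal, hypothetical analogy in plain Python: a zero-padded ("same") 1D convolution is replaced by a padding-free "valid" computation that is equivalent only on the interior outputs, and a correction pass recomputes the two boundary outputs to restore full equivalence.

```python
def conv1d_same(x, k):
    """Reference program: 'same' 1D convolution of x with a 3-tap kernel k,
    using zero padding on both ends."""
    pad = [0.0] + list(x) + [0.0]
    return [sum(pad[i + j] * k[j] for j in range(3)) for i in range(len(x))]

def conv1d_pet_style(x, k):
    """Partially equivalent transformation plus automated correction
    (illustrative sketch, not PET's actual implementation)."""
    n = len(x)
    out = [0.0] * n
    # Transformed program: compute only the padding-free "valid" interior
    # outputs. These are exactly equivalent to the reference on 1..n-2.
    for i in range(1, n - 1):
        out[i] = sum(x[i - 1 + j] * k[j] for j in range(3))
    # Correction pass: the two boundary outputs depend on the zero padding,
    # so the transformed program gets them wrong; recompute them directly
    # to restore full functional equivalence.
    out[0] = x[0] * k[1] + x[1] * k[2]
    out[-1] = x[-2] * k[0] + x[-1] * k[1]
    return out
```

In PET itself the transformed program (a "mutant") is typically faster on hardware, and the corrector recomputes only the small region where outputs diverge, so the correction cost stays low relative to the savings.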
Syllabus
Intro
Tensor Program Transformations
Current Systems Consider only Fully Equivalent Transformations
Motivating Example
PET Overview
Key Challenges
Mutant Generator
Challenges: Examine Transformations
A Strawman Approach
Multi-Linear Tensor Program (MLTP)
Mutant Corrector
Program Optimizer
More Evaluation in Paper
Taught by
USENIX
Related Courses
GraphX - Graph Processing in a Distributed Dataflow Framework
USENIX via YouTube
Theseus - An Experiment in Operating System Structure and State Management
USENIX via YouTube
RedLeaf - Isolation and Communication in a Safe Operating System
USENIX via YouTube
Microsecond Consensus for Microsecond Applications
USENIX via YouTube
KungFu - Making Training in Distributed Machine Learning Adaptive
USENIX via YouTube