ReLU Hull Approximation - Fast and Precise Convex Hull Over-Approximation for Neural Network Verification
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore a novel approach to over-approximating the convex hull of the ReLU function in neural network verification. Delve into the innovative WraLU method, which constructs a convex polytope to efficiently "wrap" the ReLU hull by reusing linear pieces and creating adjacent upper faces. Learn how this technique outperforms existing methods in precision, efficiency, and constraint complexity while addressing arbitrary input polytopes and higher-dimensional cases. Discover the impact of integrating WraLU into PRIMA, a state-of-the-art neural network verifier, and its effectiveness in verifying large-scale ReLU-based neural networks. Gain insights into how this approach significantly reduces the number of constraints for linear programming solvers while maintaining or improving verification results compared to current state-of-the-art verifiers.
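To make the idea of a convex over-approximation of ReLU concrete, the sketch below shows the classic single-neuron "triangle" relaxation, the simplest convex polytope that wraps the graph of ReLU over an input interval. This is a minimal illustrative example, not the WraLU method itself, which handles arbitrary input polytopes and multiple neurons jointly; the function names here are hypothetical.

```python
def triangle_relaxation(l, u):
    """Linear constraints of the single-neuron ReLU 'triangle' relaxation
    over the input interval [l, u] with l < 0 < u.

    Returns constraint functions c(x, y) that are >= 0 for every point
    inside the relaxation:
        y >= 0,  y >= x,  y <= u * (x - l) / (u - l)
    """
    assert l < 0 < u, "relaxation is only non-trivial when 0 lies in [l, u]"
    slope = u / (u - l)  # slope of the single upper face
    return [
        lambda x, y: y,                     # lower face: y >= 0
        lambda x, y: y - x,                 # lower face: y >= x
        lambda x, y: slope * (x - l) - y,   # upper face: y <= u(x - l)/(u - l)
    ]

def relu(x):
    return max(0.0, x)

# Soundness check: every point (x, ReLU(x)) with x in [l, u]
# must satisfy all three constraints of the relaxation.
constraints = triangle_relaxation(-1.0, 2.0)
for x in [-1.0, -0.5, 0.0, 0.7, 2.0]:
    y = relu(x)
    assert all(c(x, y) >= -1e-9 for c in constraints)
```

In higher dimensions the exact ReLU hull can require exponentially many faces; WraLU's contribution is constructing a small number of adjacent upper faces that still wrap the hull tightly, which is what keeps the constraint count low for the LP solver.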
Syllabus
[POPL'24] ReLU Hull Approximation
Taught by
ACM SIGPLAN
Related Courses
Approximation Algorithms Part I - École normale supérieure via Coursera
Approximation Algorithms Part II - École normale supérieure via Coursera
Shortest Paths Revisited, NP-Complete Problems and What To Do About Them - Stanford University via Coursera
Algorithm Design and Analysis - University of Pennsylvania via edX
Delivery Problem - University of California, San Diego via Coursera