ReLU Hull Approximation - Fast and Precise Convex Hull Over-Approximation for Neural Network Verification
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore a novel approach to over-approximating the convex hull of the ReLU function's graph in neural network verification. Delve into the WraLU method, which constructs a convex polytope that efficiently "wraps" the ReLU hull by reusing linear pieces and creating adjacent upper faces. Learn how this technique outperforms existing methods in precision, efficiency, and constraint complexity while handling arbitrary input polytopes and higher-dimensional cases. Discover the impact of integrating WraLU into PRIMA, a state-of-the-art neural network verifier, and its effectiveness in verifying large-scale ReLU-based neural networks. Gain insights into how the approach significantly reduces the number of constraints passed to linear programming solvers while matching or improving on the verification results of current state-of-the-art verifiers.
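For context on what such relaxations look like, below is a minimal sketch of the classic single-neuron "triangle" relaxation that multi-neuron methods like WraLU generalize and tighten. The function name and the NumPy constraint encoding are illustrative assumptions, not taken from the talk; WraLU itself computes joint hulls over several neurons and arbitrary input polytopes, which this single-variable sketch does not attempt.

```python
import numpy as np

def relu_triangle_relaxation(l, u):
    """Convex hull of the graph of y = ReLU(x) on [l, u] with l < 0 < u.

    Returns (A, b) encoding A @ [x, y] <= b for the three half-planes
    y >= 0, y >= x, and y <= u * (x - l) / (u - l). In one dimension
    this relaxation is exact; the hard (and interesting) case handled
    by multi-neuron methods is the joint hull over several inputs.
    """
    assert l < 0 < u, "a relaxation is only needed when the bounds straddle 0"
    slope = u / (u - l)
    A = np.array([
        [0.0, -1.0],    # -y <= 0          (i.e., y >= 0)
        [1.0, -1.0],    # x - y <= 0       (i.e., y >= x)
        [-slope, 1.0],  # y - slope*x <= -slope*l  (upper face)
    ])
    b = np.array([0.0, 0.0, -slope * l])
    return A, b
```

For example, relu_triangle_relaxation(-1.0, 2.0) yields three linear constraints containing every point (x, ReLU(x)) with x in [-1, 2]; a verifier hands constraints of this shape to an LP solver, which is why reducing their number, as the talk describes, directly cuts solving cost.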
Syllabus
[POPL'24] ReLU Hull Approximation
Taught by
ACM SIGPLAN
Related Courses
Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX