YoVDO

ReLU Hull Approximation - Fast and Precise Convex Hull Over-Approximation for Neural Network Verification

Offered By: ACM SIGPLAN via YouTube

Tags

Neural Networks Courses, Linear Programming Courses, Approximation Algorithms Courses, Polytopes Courses

Course Description

Overview

Explore a novel approach to over-approximating the convex hull of the ReLU function in neural network verification. Delve into the innovative WraLU method, which constructs a convex polytope to efficiently "wrap" the ReLU hull by reusing linear pieces and creating adjacent upper faces. Learn how this technique outperforms existing methods in precision, efficiency, and constraint complexity while addressing arbitrary input polytopes and higher-dimensional cases. Discover the impact of integrating WraLU into PRIMA, a state-of-the-art neural network verifier, and its effectiveness in verifying large-scale ReLU-based neural networks. Gain insights into how this approach significantly reduces the number of constraints for linear programming solvers while maintaining or improving verification results compared to current state-of-the-art verifiers.
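To make "convex over-approximation of ReLU" concrete, here is a minimal sketch of the classic single-neuron triangle relaxation — the simplest convex polytope containing the graph of y = ReLU(x) over an input interval [l, u]. This is not the WraLU algorithm itself (which handles the joint, multi-neuron ReLU hull over arbitrary input polytopes); it only illustrates the kind of linear constraints such verifiers feed to an LP solver. All names below are illustrative.

```python
def triangle_relaxation(l: float, u: float):
    """Return a membership test for the triangle relaxation of y = ReLU(x)
    on [l, u]: two lower faces (y >= 0, y >= x) and one upper face
    y <= u * (x - l) / (u - l) joining the points (l, 0) and (u, u)."""
    assert l < 0 < u, "relaxation is only nontrivial when 0 lies strictly inside [l, u]"
    slope = u / (u - l)  # slope of the single upper face

    def contains(x: float, y: float, eps: float = 1e-9) -> bool:
        return (y >= -eps                          # lower face: y >= 0
                and y >= x - eps                   # lower face: y >= x
                and y <= slope * (x - l) + eps)    # upper face

    return contains

contains = triangle_relaxation(-1.0, 2.0)
# Soundness: every point on the exact ReLU graph lies inside the relaxation.
assert contains(-1.0, 0.0) and contains(0.0, 0.0) and contains(2.0, 2.0)
# Imprecision: the relaxation admits points off the graph, e.g. (0, 2/3)
# lies on the upper face even though ReLU(0) = 0.
assert contains(0.0, 2.0 / 3.0)
```

Methods like WraLU aim to tighten exactly this kind of over-approximation in higher dimensions while keeping the number of constraints small for the LP solver.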

Syllabus

[POPL'24] ReLU Hull Approximation


Taught by

ACM SIGPLAN

Related Courses

Spacetime and Quantum Mechanics; Particles and Strings; Polytopes, Binary Geometries and Quiver Categories - Nima Arkani-Hamed
Institute for Advanced Study via YouTube
Proving Analytic Inequalities
Joint Mathematics Meetings via YouTube
Algebraic Structures on Polytopes
Joint Mathematics Meetings via YouTube
José Samper Seminar - Higher Chordality
Applied Algebraic Topology Network via YouTube
Cynthia Vinzant - Log Concave Polynomials and Matroids
Hausdorff Center for Mathematics via YouTube