
Maximum Consensus Floating Point Solutions for Infeasible Low-Dimensional Linear Programs - Lecture

Offered By: ACM SIGPLAN via YouTube

Tags

Linear Programming Courses
Computer Science Courses
Numerical Analysis Courses
Computational Geometry Courses
Algorithm Optimization Courses

Course Description

Overview

Explore a novel method for efficiently solving infeasible low-dimensional linear programs (LDLPs) with billions of constraints and a small number of unknown variables in this 18-minute conference talk from PLDI 2024. Delve into the approach presented by Mridul Aanjaneya and Santosh Nagarakatte from Rutgers University, which generates floating point solutions that satisfy the maximum number of constraints, developed in the context of the RLibm project for creating correctly rounded math libraries. Learn how the researchers leverage the geometric duality between linear programs and convex hulls, using the convex hull as an intermediate representation to split constraints into feasible and infeasible subsets. Discover the key idea of identifying a superset of the infeasible constraints by computing a convex hull in two dimensions, and how this approach improves the performance of RLibm polynomials while significantly reducing the time to solve the corresponding linear programs.
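
To make the constraint-splitting idea concrete, here is a loose illustration in Python (not the authors' algorithm or the RLibm code): it builds a synthetic RLibm-style problem with two unknowns, where a line c0 + c1*x must pass through an interval [lo_i, hi_i] at every input x_i, uses 2-D convex hulls of the interval endpoints to pick out a small candidate set of constraints that could cause infeasibility, solves the LP over the remaining constraints, and counts how many of the original constraints the resulting floating point candidate satisfies. The data, variable names, and the specific hull construction are illustrative assumptions; the paper's duality handles general low-dimensional LPs with higher-degree polynomials.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
n = 1000

# Synthetic RLibm-style instance (assumed, for illustration only): at each
# input x_i, the line c0 + c1*x_i must land inside an interval [lo_i, hi_i].
# One interval is moved away from the others so the full system is infeasible.
x = np.sort(rng.uniform(-1.0, 1.0, n))
center = 0.5 + 0.25 * x
lo = center - rng.uniform(0.01, 0.05, n)
hi = center + rng.uniform(0.01, 0.05, n)
lo[n // 2] = center[n // 2] + 0.2   # contradictory interval -> infeasible LP
hi[n // 2] = center[n // 2] + 0.3

# LP rows over unknowns (c0, c1):
#   c0 + c1*x_i <= hi_i      (rows 0 .. n-1)
#   -(c0 + c1*x_i) <= -lo_i  (rows n .. 2n-1)
A_ub = np.vstack([np.column_stack([np.ones(n), x]),
                  np.column_stack([-np.ones(n), -x])])
b_ub = np.concatenate([hi, -lo])

# A line above all hull vertices of the lower endpoints is above every lower
# endpoint (and symmetrically for the upper endpoints), so only endpoints that
# are vertices of a 2-D convex hull can take part in making the system
# infeasible: the hull vertices give a small superset of the problem rows.
upper_cand = set(ConvexHull(np.column_stack([x, hi])).vertices.tolist())
lower_cand = set(ConvexHull(np.column_stack([x, lo])).vertices.tolist())
cand_rows = upper_cand | {n + v for v in lower_cand}
keep = [r for r in range(2 * n) if r not in cand_rows]

# Feasibility LP over the non-candidate rows (zero objective: any feasible
# point will do), then count how many of ALL constraints it satisfies.
res = linprog(c=[0.0, 0.0], A_ub=A_ub[keep], b_ub=b_ub[keep],
              bounds=[(None, None), (None, None)], method="highs")
if res.success:
    c0, c1 = res.x
    satisfied = int(np.sum(A_ub @ res.x <= b_ub + 1e-9))
    print(f"candidate (c0, c1) = ({c0:.4f}, {c1:.4f}) "
          f"satisfies {satisfied} of {2 * n} constraints")
else:
    print("restricted LP still infeasible; more rows would need to be "
          "treated as candidates")
```

This fragment only shows why a 2-D convex hull yields a small candidate subset to examine; the talk describes how the full method uses the duality on the actual RLibm linear programs and maximizes the number of satisfied constraints.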

Syllabus

[PLDI24] Maximum Consensus Floating Point Solutions for Infeasible Low-Dimensional Linear Programs


Taught by

ACM SIGPLAN

Related Courses

Linear and Discrete Optimization
École Polytechnique Fédérale de Lausanne via Coursera
Linear and Integer Programming
University of Colorado Boulder via Coursera
Graph Partitioning and Expanders
Stanford University via NovoEd
Discrete Inference and Learning in Artificial Vision
École Centrale Paris via Coursera
Convex Optimization
Stanford University via edX