Are Neural Networks Optimal Approximation Algorithms for Constraint Satisfaction Problems?

Offered By: USC Probability and Statistics Seminar via YouTube

Tags

Neural Networks Courses Computational Complexity Courses Combinatorial Optimization Courses Semidefinite Programming Courses Constraint Satisfaction Problems Courses Optimization Algorithms Courses

Course Description

Overview

Explore the capabilities of neural networks in solving NP-hard optimization problems, particularly constraint satisfaction problems, in this 40-minute talk from the USC Probability and Statistics Seminar. Delve into the OptGNN graph neural network architecture and its ability to capture optimal approximation algorithms for constraint satisfaction. Discover how OptGNN functions as a convex program solver whose solutions yield bounds on the optimality of combinatorial solutions. Examine OptGNN's competitive performance against state-of-the-art unsupervised neural baselines on neural combinatorial optimization benchmarks. Gain insights into the connections between neural networks and computation, and consider potential avenues for future research in this field.
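For context on the classical pipeline the talk builds on: optimal approximation algorithms for many constraint satisfaction problems come from solving a semidefinite-programming (SDP) relaxation and rounding its solution. The sketch below (an illustrative assumption, not the OptGNN architecture itself) shows this for Max-Cut: a low-rank Burer-Monteiro form of the SDP solved by projected gradient descent, followed by Goemans-Williamson random-hyperplane rounding.

```python
import numpy as np

def maxcut_sdp_round(adj, dim=8, steps=500, lr=0.1, rounds=20, seed=0):
    """Low-rank (Burer-Monteiro) SDP relaxation of Max-Cut with
    Goemans-Williamson random-hyperplane rounding.

    adj: symmetric 0/1 adjacency matrix (numpy array).
    Returns (signs, cut_value) for the best of `rounds` roundings.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    # One unit vector per vertex, initialized at random.
    V = rng.normal(size=(n, dim))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    # Projected gradient descent on sum_{ij} A_ij <v_i, v_j>:
    # pushing neighboring vectors apart maximizes the relaxed cut.
    for _ in range(steps):
        V -= lr * (adj @ V)
        V /= np.linalg.norm(V, axis=1, keepdims=True)
    # Rounding: assign each vertex the sign of its projection
    # onto a random direction; keep the best of several tries.
    best_signs, best_cut = None, -1
    for _ in range(rounds):
        signs = np.sign(V @ rng.normal(size=dim))
        # Each cut edge is counted twice in the symmetric matrix.
        cut = int(np.sum(adj[signs[:, None] != signs[None, :]]) // 2)
        if cut > best_cut:
            best_signs, best_cut = signs, cut
    return best_signs, best_cut

# Example: a 4-cycle, whose maximum cut has value 4.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
signs, cut = maxcut_sdp_round(A)
```

The iterative message-passing flavor of this solver (each vertex repeatedly aggregating its neighbors' vectors) is what makes the connection to graph neural networks plausible; the talk makes that connection precise for OptGNN.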

Syllabus

Morris Yau: Are Neural Networks Optimal Approximation Algorithms? (MIT)


Taught by

USC Probability and Statistics Seminar

Related Courses

Graph Partitioning and Expanders
Stanford University via NovoEd
Convex Optimization
Stanford University via edX
Approximation Algorithms Part II
École normale supérieure via Coursera
The State of JuMP - Progress and Future Plans
The Julia Programming Language via YouTube
Quantum Algorithms for Optimization - Quantum Colloquium
Simons Institute via YouTube