YoVDO

Are Neural Networks Optimal Approximation Algorithms for Constraint Satisfaction Problems?

Offered By: USC Probability and Statistics Seminar via YouTube

Tags

Neural Networks Courses
Computational Complexity Courses
Combinatorial Optimization Courses
Semidefinite Programming Courses
Constraint Satisfaction Problems Courses
Optimization Algorithms Courses

Course Description

Overview

Explore the capabilities of neural networks in solving NP-hard optimization problems, particularly constraint satisfaction problems, in this 40-minute talk from the USC Probability and Statistics Seminar. Delve into the OptGNN graph neural network architecture and its ability to capture optimal approximation algorithms for constraint satisfaction. Discover how OptGNN functions as a convex program solver, providing bounds on combinatorial problem optimality. Examine the competitive performance of OptGNN against state-of-the-art unsupervised neural baselines in neural combinatorial optimization benchmarks. Gain insights into the connections between neural networks and computation, and consider potential avenues for future research in this field.
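The description frames OptGNN as a neural network whose computation acts like a convex (semidefinite) relaxation solver for constraint satisfaction problems. As a rough illustration of that underlying idea only (not the OptGNN architecture itself, which is described in the talk), the following sketch solves a low-rank Max-Cut SDP relaxation by projected gradient ascent and then rounds with a random hyperplane, Goemans-Williamson style. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def maxcut_relaxation(adj, dim=8, steps=300, lr=0.1, seed=0):
    """Illustrative sketch: approximate the Max-Cut SDP by placing one
    unit vector per vertex (a low-rank / Burer-Monteiro relaxation),
    pushing neighboring vectors apart, then rounding with a random
    hyperplane. Not the OptGNN architecture from the talk."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    # Random unit vectors, one per vertex.
    V = rng.normal(size=(n, dim))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(steps):
        # Ascent on sum_{ij} w_ij (1 - v_i . v_j) / 2:
        # each vertex moves away from the sum of its neighbors.
        V += lr * (-adj @ V)
        V /= np.linalg.norm(V, axis=1, keepdims=True)
    # Goemans-Williamson-style rounding: sign of projection onto
    # a random direction assigns each vertex to a side of the cut.
    signs = np.sign(V @ rng.normal(size=dim))
    cut = sum(adj[i, j]
              for i in range(n) for j in range(i + 1, n)
              if adj[i, j] and signs[i] != signs[j])
    return cut, signs

if __name__ == "__main__":
    # 4-cycle: optimal cut value is 4 (alternate the two sides).
    C4 = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]], dtype=float)
    cut, signs = maxcut_relaxation(C4)
    print(cut, signs)
```

The relaxed vector solution also certifies an upper bound on the true optimum, which is the "bounds on combinatorial problem optimality" role the description attributes to OptGNN's convex-solver view.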

Syllabus

Morris Yau: Are Neural Networks Optimal Approximation Algorithms (MIT)


Taught by

USC Probability and Statistics Seminar

Related Courses

Neural Networks for Machine Learning
University of Toronto via Coursera
Good Brain, Bad Brain: Basics
University of Birmingham via FutureLearn
Statistical Learning with R
Stanford University via edX
Machine Learning 1—Supervised Learning
Brown University via Udacity
Fundamentals of Neuroscience, Part 2: Neurons and Networks
Harvard University via edX