Recent Advances in EAGO.jl and its Use With JuMP.jl

Offered By: The Julia Programming Language via YouTube

Tags

Julia Courses Automatic Differentiation Courses

Course Description

Overview

Explore the recent advancements in EAGO.jl, an open-source deterministic global optimizer for mixed-integer nonlinear programs (MINLPs), and its integration with JuMP.jl in this 11-minute video presentation. Discover how EAGO leverages Julia's multiple dispatch and speed to solve complex optimization problems. Learn about EAGO's use of McCormick-based convex and concave relaxation theory for rigorous global bounding, its support for user-defined functions, and its ability to perform symbolic transformations on directed acyclic graphs. Understand the benefits and challenges of developing EAGO alongside JuMP, including improved user experience in setting up optimization models and the need for internal interpretation of JuMP model information. Gain insights into EAGO's flexibility as a research platform for solving non-standard optimization problems in engineering applications, and its dependence on JuMP and MathOptInterface functionality.
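The JuMP integration described above means EAGO is used like any other JuMP solver backend. The sketch below is a minimal, illustrative example (the variables, bounds, and nonconvex quadratic objective are invented for demonstration, not taken from the talk); it assumes the JuMP and EAGO packages are installed.

```julia
using JuMP, EAGO

# Attach EAGO's deterministic global optimizer as the JuMP backend.
model = Model(EAGO.Optimizer)
set_silent(model)

# Box-constrained decision variables (bounds chosen for illustration).
@variable(model, -2 <= x <= 2)
@variable(model, -2 <= y <= 2)

# A nonconvex quadratic objective: local solvers may stall at a
# stationary point, while EAGO's branch-and-bound with McCormick
# relaxations certifies a global minimum.
@objective(model, Min, x * y + x^2 - y^2)
@constraint(model, x^2 + y^2 <= 4)

optimize!(model)
println(termination_status(model), ": objective = ", objective_value(model))
```

Because EAGO implements the MathOptInterface API, the same model can be pointed at a different solver by changing only the `Model(...)` call, which is the user-experience benefit of developing EAGO alongside JuMP that the talk highlights.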

Syllabus

Recent Advances in EAGO.jl and its Use With JuMP.jl


Taught by

The Julia Programming Language

Related Courses

Introduction to Neural Networks and PyTorch
IBM via Coursera
Regression with Automatic Differentiation in TensorFlow
Coursera Project Network via Coursera
Neural Network from Scratch in TensorFlow
Coursera Project Network via Coursera
Customising your models with TensorFlow 2
Imperial College London via Coursera
PyTorch Fundamentals
Microsoft via Microsoft Learn