Recent Advances in Optimization Solvers within JuliaSmoothOptimizers
Offered By: The Julia Programming Language via YouTube
Course Description
Overview
Explore the latest advancements in continuous nonlinear nonconvex optimization solvers within the JuliaSmoothOptimizers (JSO) organization in this 17-minute presentation. Gain insights into why these solvers are suitable for large-scale optimization problems and learn about new packages such as AdaptiveRegularization.jl. Discover how research-level solvers leverage high-level linear algebra packages, support flexible modeling through Automatic Differentiation (AD) or tools like JuMP, and keep their implementations focused on the algorithm itself. Understand the benefits of in-place solvers, multiprecision solves, and GPU computations. Examine how factorization-free solvers address large-scale optimization problems, such as those arising from discretized PDE-constrained optimization. Learn how the JSO ecosystem is organized across many packages and how JSOSuite.jl serves as a comprehensive entry point for first-time users, simplifying benchmarking and introducing automatic algorithm selection (see the sketches below). Finally, explore the longevity and widespread adoption of JuliaSmoothOptimizers since 2015, with over 50 registered packages spanning linear and nonlinear optimization and linear algebra across diverse ecosystems.
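To make the modeling-and-solving workflow mentioned above concrete, here is a minimal sketch using ADNLPModels.jl and JSOSolvers.jl, two JSO packages. The Rosenbrock objective and the choice of the trunk solver are illustrative assumptions, not examples taken from the talk itself:

```julia
# Minimal sketch of the JSO workflow: model with AD, solve factorization-free.
# The Rosenbrock objective below is an illustrative assumption.
using ADNLPModels, JSOSolvers

f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# ADNLPModel builds an NLPModel whose derivatives come from automatic differentiation.
nlp = ADNLPModel(f, [-1.2, 1.0])

# trunk is a factorization-free trust-region solver from JSOSolvers.jl; it only
# needs Hessian-vector products, which is what lets it scale to large problems.
stats = trunk(nlp)
println(stats.status, ": objective = ", stats.objective)

# Multiprecision: building the model from a Float32 starting point makes the
# entire solve run in single precision.
nlp32 = ADNLPModel(f, Float32[-1.2, 1.0])
stats32 = trunk(nlp32)
```

The in-place solvers mentioned in the overview follow the same pattern: construct a solver object once (e.g. TrunkSolver(nlp)) and reuse it through SolverCore's solve!, avoiding reallocations across repeated solves.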
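JSOSuite.jl is presented as the single entry point with automatic algorithm selection. A hedged sketch of what that looks like, assuming JSOSuite's minimize entry point (keyword names and selection heuristics may vary across versions):

```julia
# Hedged sketch: JSOSuite.jl as a one-stop entry point for first-time users.
using JSOSuite

# minimize inspects the problem (unconstrained here) and picks a suitable
# JSO solver automatically, so no solver-specific knowledge is required.
stats = minimize(x -> (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2, [-1.2, 1.0])
println(stats.solution)
```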
Syllabus
Recent Advances in Optimization Solvers within JuliaSmoothOptimizers
Taught by
The Julia Programming Language
Related Courses
Introduction to Neural Networks and PyTorch (IBM via Coursera)
Regression with Automatic Differentiation in TensorFlow (Coursera Project Network via Coursera)
Neural Network from Scratch in TensorFlow (Coursera Project Network via Coursera)
Customising your models with TensorFlow 2 (Imperial College London via Coursera)
PyTorch Fundamentals (Microsoft via Microsoft Learn)