Gradient and Hessian Approximations for Model-based Blackbox Optimization

Offered By: GERAD Research Center via YouTube

Tags

Numerical Methods
Newton's Method

Course Description

Overview

Explore gradient and Hessian approximations for model-based blackbox optimization in this 48-minute seminar from the GERAD Research Center. Delve into the mathematical theory behind optimizing blackbox functions, which return output values without any accompanying derivative or structural information. Examine classical and novel approximation techniques for blackbox functions, and see them applied in a medical physics case study. Learn about solid-state tank design optimization, order-N accuracy, Newton's Method, and various gradient models. Investigate generalized simplex gradients, pseudoinverses, error bounds, and centred simplex gradients. Discover adjusted gradient techniques, simplex Hessians, and potential future research directions in this talk by Warren Hare of the University of British Columbia.
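The description above names the generalized simplex gradient as a core tool of the talk. As a rough illustration only (not material taken from the seminar), the Python sketch below approximates a gradient from blackbox function values alone, using the Moore-Penrose pseudoinverse of the matrix of sample directions; the test function, step size, and direction set are illustrative assumptions.

import numpy as np

def generalized_simplex_gradient(f, x0, directions):
    """Hypothetical sketch of a generalized simplex gradient.

    Approximates the gradient of a blackbox function f at x0 from
    the sampled values f(x0 + d) for each column d of `directions`,
    using the Moore-Penrose pseudoinverse so the sample set need not
    form a well-poised simplex.
    """
    S = np.asarray(directions, dtype=float)   # n x k matrix of sample directions
    f0 = f(x0)
    # Function-value differences f(x0 + d) - f(x0), one per direction
    delta = np.array([f(x0 + S[:, j]) - f0 for j in range(S.shape[1])])
    # Generalized simplex gradient: pseudoinverse of S^T applied to the differences
    return np.linalg.pinv(S.T) @ delta

# Illustrative usage on a smooth test function (an assumption, not from the talk)
if __name__ == "__main__":
    f = lambda x: x[0] ** 2 + 3.0 * x[1]
    x0 = np.array([1.0, 2.0])
    h = 1e-5
    D = h * np.eye(2)                          # coordinate directions with step h
    print(generalized_simplex_gradient(f, x0, D))   # approximately [2.0, 3.0]

Because a pseudoinverse replaces the matrix inverse of the classical simplex gradient, the sample set need not contain exactly n affinely independent points, which is broadly what the "generalized" in the name refers to.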

Syllabus

Gradient and Hessian Approximations for Model-based Blackbox Optimization
Solid state tank design
Optimizing the design
Order-N accuracy at x
Newton's Method
Proof
Models from gradients
A cleaner approach
Generalizing the Simplex Gradient
Pseudo inverses
Generalized Simplex Gradient error bound
Centred Simplex Gradients
Adjusted generalized centred simplex gradient
Adjusted Centred Simplex Gradient
A simpler approach
Generalized Simplex Hessian
Summary
Open directions


Taught by

GERAD Research Center

Related Courses

Introduction to Statistics: Descriptive Statistics
University of California, Berkeley via edX
Mathematical Methods for Quantitative Finance
University of Washington via Coursera
Dynamics
Massachusetts Institute of Technology via edX
Practical Numerical Methods with Python
George Washington University via Independent
Statistics I: Fundamentals of Data Analysis (ga014)
University of Tokyo via gacco