Multi-Precision Optimization Algorithm: Decreasing Computational Cost and Controlling Computational Error

Offered By: GERAD Research Center via YouTube

Tags

Optimization Algorithms Courses Gradient Descent Courses Numerical Analysis Courses Deep Neural Networks Courses

Course Description

Overview

Explore a 56-minute seminar from the "Meet a GERAD researcher!" series focusing on multi-precision optimization algorithms for minimizing smooth, non-convex functions. Delve into the Quadratic Regularization (R2) algorithm, a gradient descent method with adaptive step size, and its extension into a Multi-Precision (MPR2) version. Learn how MPR2 dynamically adapts precision levels to reduce computational effort while maintaining convergence to a minimum. Discover the challenges of variable precision computing and how MPR2 addresses them. Examine the algorithm's performance through various problem examples, with particular emphasis on applications like deep neural network training where computational cost reduction is crucial.
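To make the description above concrete, here is a minimal sketch of the idea it outlines: an R2-style quadratically regularized gradient loop that evaluates the function and gradient in low precision and escalates precision when rounding error could swamp the predicted decrease. This is not the seminar's exact MPR2 algorithm; the function name, the precision ladder, the escalation test, and all parameter names are illustrative assumptions.

```python
import numpy as np

def mpr2_sketch(f, grad, x0, sigma=1.0, eta=1e-4, gamma=2.0,
                tol=1e-6, max_iter=500):
    """Illustrative multi-precision quadratic-regularization sketch.

    At each iterate the quadratic model
        m(s) = f(x) + g.s + (sigma/2)||s||^2
    is minimized exactly by s = -g/sigma, and sigma is adapted from
    the ratio of actual to predicted decrease (the R2 mechanism).
    Evaluations start in the cheapest floating-point format and move
    up when the model decrease falls below that format's rounding
    level. All thresholds here are assumptions, not the seminar's.
    """
    precisions = [np.float16, np.float32, np.float64]
    level = 0                                 # start at the cheapest precision
    x = np.asarray(x0, dtype=np.float64)
    fx = float(f(x))
    for _ in range(max_iter):
        dt = precisions[level]
        # Evaluate the gradient at the current (possibly low) precision.
        g = np.asarray(grad(x.astype(dt)), dtype=np.float64)
        if np.linalg.norm(g) <= tol:
            break
        s = -g / sigma                        # minimizer of the quadratic model
        pred = g.dot(g) / (2.0 * sigma)       # model decrease m(0) - m(s)
        # If the predicted decrease is below the rounding level of the
        # current format, it cannot be trusted: escalate precision.
        if pred <= max(abs(fx), 1.0) * np.finfo(dt).eps and level + 1 < len(precisions):
            level += 1
            continue
        f_trial = float(f((x + s).astype(dt)))
        rho = (fx - f_trial) / pred           # actual vs. predicted decrease
        if rho >= eta:                        # successful step: accept, relax sigma
            x, fx = x + s, f_trial
            sigma = max(sigma / gamma, 1e-8)
        else:                                 # unsuccessful: regularize more
            sigma *= gamma
    return x

# Hypothetical test problem: minimize ||v||^2 from a nonzero start.
if __name__ == "__main__":
    f = lambda v: float(np.sum(np.asarray(v, dtype=np.float64) ** 2))
    grad = lambda v: 2.0 * np.asarray(v, dtype=np.float64)
    print(mpr2_sketch(f, grad, np.ones(5)))
```

The design point the sketch tries to capture is the one the seminar description emphasizes: cheap low-precision evaluations are used for as long as the computational error they introduce can be controlled relative to the decrease the method expects, which is what makes the approach attractive for expensive settings such as deep neural network training.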

Syllabus

Multi-Precision Optimization Algorithm: Decreasing Computational Cost and Controlling Computational Error


Taught by

GERAD Research Center

Related Courses

Deep Learning for Natural Language Processing
University of Oxford via Independent
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Logistic Regression with Python and Numpy
Coursera Project Network via Coursera