YoVDO

A Stochastic Newton Algorithm for Distributed Convex Optimization

Offered By: Simons Institute via YouTube

Tags

Convex Optimization Courses, Algorithm Design Courses, Distributed Computing Courses

Course Description

Overview

Explore a 35-minute lecture on a novel stochastic Newton algorithm for distributed convex optimization presented by Brian Bullins from Purdue University at the Simons Institute. Delve into the proposed method for homogeneous distributed stochastic convex optimization, where machines calculate stochastic gradients and Hessian-vector products of the same population objective. Learn how this algorithm reduces communication rounds without compromising performance, particularly for quasi-self-concordant objectives like logistic regression. Examine the convergence guarantees and empirical evidence supporting the effectiveness of this approach in optimization and algorithm design.
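To make the setup concrete, here is a minimal, illustrative sketch of the general idea described above, not the exact algorithm from the talk: several "machines" hold samples from the same population, each supplies stochastic gradients and Hessian-vector products of a logistic-regression objective, and a server averages these and takes approximate Newton steps, solving each Newton system by conjugate gradient so that only Hessian-vector products (never the full Hessian) are needed. All function names and constants are our own choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_per_machine, n_machines = 5, 200, 4
w_true = rng.normal(size=d)

def make_data(n):
    # synthetic logistic-regression data drawn from one population
    X = rng.normal(size=(n, d))
    p = 1.0 / (1.0 + np.exp(-X @ w_true))
    y = (rng.random(n) < p).astype(float)
    return X, y

machines = [make_data(n_per_machine) for _ in range(n_machines)]

def grad(w, X, y):
    # stochastic gradient of the logistic loss on one machine's sample
    s = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (s - y) / len(y)

def hvp(w, X, v):
    # Hessian-vector product, computed without forming the d x d Hessian
    s = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ ((s * (1 - s)) * (X @ v)) / X.shape[0]

def avg_grad(w):
    # server averages the machines' stochastic gradients
    return np.mean([grad(w, X, y) for X, y in machines], axis=0)

def avg_hvp(w, v):
    # server averages the machines' Hessian-vector products
    return np.mean([hvp(w, X, v) for X, _ in machines], axis=0)

def cg(hvp_fn, b, iters=20, tol=1e-12):
    # conjugate gradient for H x = b using only Hessian-vector products
    x = np.zeros_like(b)
    r = b.copy(); p = r.copy(); rs = r @ r
    for _ in range(iters):
        Hp = hvp_fn(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

w = np.zeros(d)
for _ in range(5):  # each outer loop plays the role of a communication round
    g = avg_grad(w)
    step = cg(lambda v: avg_hvp(w, v) + 1e-8 * v, g)  # tiny damping for safety
    w -= step

print(np.linalg.norm(avg_grad(w)))  # gradient norm after a few Newton steps
```

The point of the sketch is the communication pattern: each outer iteration needs only averaged gradients and Hessian-vector products from the machines, which is why Newton-type methods can cut the number of communication rounds relative to purely first-order schemes.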

Syllabus

A Stochastic Newton Algorithm for Distributed Convex Optimization


Taught by

Simons Institute

Related Courses

Cloud Computing Concepts, Part 1
University of Illinois at Urbana-Champaign via Coursera
Cloud Computing Concepts: Part 2
University of Illinois at Urbana-Champaign via Coursera
Reliable Distributed Algorithms - Part 1
KTH Royal Institute of Technology via edX
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera
Réalisez des calculs distribués sur des données massives
CentraleSupélec via OpenClassrooms