
Fast Distributed Optimization with Asynchrony and Time Delays - Lecture 3

Offered By: International Centre for Theoretical Sciences via YouTube

Tags

Machine Learning Courses
Parallel Computing Courses
Stochastic Gradient Descent Courses

Course Description

Overview

Explore fast distributed optimization in the presence of asynchrony and time delays in this lecture by Laurent Massoulié, delivered as part of the "Data Science: Probabilistic and Optimization Methods" discussion meeting organized by the International Centre for Theoretical Sciences. Gain insight into techniques for processing large-scale data efficiently despite the asynchronous operation and time lags inherent in distributed systems, and examine how these methods bridge theoretical foundations and practical applications in data science. The hour-long session is part of a five-day program of discussions on probabilistic and optimization methods, featuring renowned researchers and a diverse community of participants.
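
The page does not include lecture materials, but the minimal Python sketch below (an illustration, not taken from the lecture) shows the kind of effect the title refers to: gradient descent on a toy quadratic where each update uses a gradient that is several iterations stale, as happens when asynchronous workers report delayed gradients. The objective, step size, and delay are illustrative assumptions.

import numpy as np

def delayed_gradient_descent(A, b, x0, step=0.05, tau=3, iters=300):
    # Minimize 0.5*x'Ax - b'x, but each gradient is evaluated at the iterate
    # from `tau` steps earlier -- a simple model of an asynchronous worker
    # returning a stale gradient after a computation/communication delay.
    x = x0.copy()
    buffer = [x0.copy() for _ in range(tau + 1)]  # past iterates, oldest first
    for _ in range(iters):
        x_stale = buffer.pop(0)        # iterate the delayed worker actually saw
        grad = A @ x_stale - b         # stale gradient
        x = x - step * grad
        buffer.append(x.copy())
    return x

A = np.diag([1.0, 2.0, 5.0])           # toy strongly convex quadratic
b = np.array([1.0, -2.0, 0.5])
x_star = np.linalg.solve(A, b)         # exact minimizer for comparison
x_hat = delayed_gradient_descent(A, b, x0=np.zeros(3))
print("distance to optimum with delayed gradients:", np.linalg.norm(x_hat - x_star))

In a genuinely distributed setting the stale gradients would come from several workers running in parallel; the point of the sketch is only that delayed gradient information can still yield convergence when the step size is kept small relative to the delay.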

Syllabus

Fast Distributed Optimization with Asynchrony and Time Delays (Lecture 3) by Laurent Massoulié


Taught by

International Centre for Theoretical Sciences

Related Courses

Building Classification Models with scikit-learn
Pluralsight
Practical Deep Learning for Coders - Full Course
freeCodeCamp
Neural Networks Made Easy
Udemy
Intro to Deep Learning
Kaggle
Stochastic Gradient Descent
Great Learning via YouTube