
Scaling Distributed Machine Learning with the Parameter Server

Offered By: USENIX via YouTube

Tags

OSDI (Operating Systems Design and Implementation) Courses
Consistency Models Courses
Distributed Machine Learning Courses

Course Description

Overview

Explore a conference talk from OSDI '14 that introduces a parameter server framework for distributed machine learning. Learn how the framework manages asynchronous data communication between nodes and supports flexible consistency models, elastic scalability, and continuous fault tolerance. Discover how this approach distributes both data and workloads over worker nodes while maintaining globally shared parameters on server nodes. Examine experimental results demonstrating the framework's scalability on petabytes of real data with billions of examples and parameters, covering problems ranging from sparse logistic regression to Latent Dirichlet Allocation and distributed sketching.
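
To make the worker/server split concrete, below is a minimal single-process sketch of the parameter-server pattern described above. It is not the paper's actual API: the class names (ServerNode, WorkerNode) and the push/pull methods are illustrative assumptions, and a real deployment would shard parameters across many server machines and let workers communicate asynchronously.

# Minimal parameter-server sketch (illustrative names, not the OSDI '14 API).
import numpy as np

class ServerNode:
    """Holds the globally shared parameters (here: one weight vector)."""
    def __init__(self, dim):
        self.weights = np.zeros(dim)

    def pull(self):
        # Workers read the current shared parameters.
        return self.weights.copy()

    def push(self, gradient, lr=0.1):
        # Workers send gradients; the server applies them to the shared state.
        self.weights -= lr * gradient

class WorkerNode:
    """Owns a partition of the training data and computes local gradients."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def compute_gradient(self, weights):
        # Logistic-regression gradient computed on this worker's data only.
        preds = 1.0 / (1.0 + np.exp(-self.x @ weights))
        return self.x.T @ (preds - self.y) / len(self.y)

# Toy data split across two workers.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = (x @ true_w > 0).astype(float)

server = ServerNode(dim=5)
workers = [WorkerNode(x[:100], y[:100]), WorkerNode(x[100:], y[100:])]

for step in range(50):
    for w in workers:  # sequential here; asynchronous in a real cluster
        grad = w.compute_gradient(server.pull())
        server.push(grad)

print("learned weights:", np.round(server.weights, 2))

In this sketch the update loop is synchronous for simplicity; the flexibility described in the talk comes from relaxing exactly this step, letting workers push and pull without waiting on one another under a chosen consistency model.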

Syllabus

OSDI '14 - Scaling Distributed Machine Learning with the Parameter Server


Taught by

USENIX

Related Courses

Reliable Distributed Algorithms - Part 1
KTH Royal Institute of Technology via edX
Developing with Amazon DynamoDB (Italian)
Amazon Web Services via AWS Skill Builder
Managing Consistency, Capacity, and Performance in DynamoDB (German)
Amazon Web Services via AWS Skill Builder
Managing Consistency, Capacity, and Performance in DynamoDB (Italian)
Amazon Web Services via AWS Skill Builder
Managing Consistency, Capacity, and Performance in DynamoDB (Portuguese)
Amazon Web Services via AWS Skill Builder