Work Assisting - Linking Task-Parallel Work Stealing with Data-Parallel Self Scheduling
Offered By: ACM SIGPLAN via YouTube
Course Description
Overview
Explore a novel scheduling strategy called work assisting in this 25-minute conference talk from ARRAY 2024. Discover how this approach combines data parallelism with task parallelism by letting each thread publish its current data-parallel activity in a shared array so that other threads can assist with it. Learn about the advantages of preserving the structure of data parallelism, rather than expressing all parallelism as task parallelism, which enables the use of self-scheduling for data-parallel algorithms. Understand how this flexible scheduling algorithm adapts to different scenarios, potentially outperforming schedulers based purely on task parallelism. Examine benchmarks demonstrating the performance of this approach across several problem domains.
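To make the idea concrete, here is a minimal sketch in Rust of the self-scheduling mechanism the description refers to: threads claim chunks of a data-parallel range through a shared atomic counter, so an idle thread can "assist" a loop another thread started. This is not the authors' implementation; the names SharedTask and work are invented for illustration, and the per-thread shared array of activities is collapsed to a single shared activity for brevity.

use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread;

// A data-parallel activity: an index range with a shared claim counter.
// Both the owning thread and any assisting thread grab chunks with
// fetch_add (self scheduling), so no explicit work splitting is needed.
struct SharedTask {
    next: AtomicUsize, // next unclaimed index
    len: usize,        // total number of iterations
}

impl SharedTask {
    fn new(len: usize) -> Self {
        SharedTask { next: AtomicUsize::new(0), len }
    }

    // Claim and process chunks until the range is exhausted.
    fn work(&self, chunk: usize, body: &(dyn Fn(usize) + Sync)) {
        loop {
            let start = self.next.fetch_add(chunk, Ordering::Relaxed);
            if start >= self.len {
                break;
            }
            let end = (start + chunk).min(self.len);
            for i in start..end {
                body(i);
            }
        }
    }
}

fn main() {
    let n = 10_000;
    let sum = Arc::new(AtomicUsize::new(0));
    // In the talk's scheme each thread would publish its activity in a
    // shared array; here all threads simply assist one shared activity.
    let task = Arc::new(SharedTask::new(n));

    let mut handles = Vec::new();
    for _ in 0..4 {
        let task = Arc::clone(&task);
        let sum = Arc::clone(&sum);
        handles.push(thread::spawn(move || {
            // Each thread self-schedules chunks of the same loop,
            // i.e. it assists the data-parallel activity.
            task.work(256, &|i| {
                sum.fetch_add(i, Ordering::Relaxed);
            });
        }));
    }
    for h in handles {
        h.join().unwrap();
    }

    // Every index 0..n is processed exactly once.
    assert_eq!(sum.load(Ordering::Relaxed), n * (n - 1) / 2);
    println!("sum = {}", sum.load(Ordering::Relaxed));
}

In the full scheme described in the talk, an idle thread would first try to assist an activity found in the shared array and fall back to task-parallel work stealing otherwise; the sketch only shows the self-scheduling half of that design.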
Syllabus
[ARRAY24] Work Assisting: Linking Task-Parallel Work Stealing with Data-Parallel Self Scheduling
Taught by
ACM SIGPLAN
Related Courses
Intro to Parallel Programming - Nvidia via Udacity
Introduction to Linear Models and Matrix Algebra - Harvard University via edX
Introduction to Parallel Programming Using OpenMP and MPI - Tomsk State University via Coursera
Supercomputing - Partnership for Advanced Computing in Europe via FutureLearn
Fundamentals of Parallelism on Intel Architecture - Intel via Coursera