Faster Neural Network Training with Data Echoing - Paper Explained

Offered By: Yannic Kilcher via YouTube

Tags

Machine Learning Courses
Algorithms Courses

Course Description

Overview

Explore a detailed explanation of the "Data Echoing" technique, designed to optimize machine learning pipelines by addressing CPU bottlenecks. Learn how this method reuses data already in the pipeline to maximize GPU utilization and reduce idle time. Discover the impact of data echoing on various workloads, batch sizes, and echoing amounts, and understand how it can significantly decrease training time for models like ResNet-50 on ImageNet. Gain insights into the future of neural network training as accelerators continue to improve and earlier pipeline stages become potential bottlenecks.
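The core idea described above — reusing data already in the pipeline so the accelerator is not left idle while the CPU prepares the next batch — can be sketched as a simple generator. This is an illustrative sketch only, not the paper's implementation; the names `data_echoing`, `upstream`, `echo_factor`, and `buffer_size` are assumptions, and the small shuffle buffer mirrors the paper's observation that echoed copies should not appear strictly back-to-back.

```python
import random

def data_echoing(upstream, echo_factor=2, buffer_size=8, seed=0):
    """Yield each item from `upstream` `echo_factor` times.

    A small shuffle buffer interleaves echoed copies so repeats are
    not strictly adjacent. Hypothetical sketch of the technique, not
    the authors' code.
    """
    rng = random.Random(seed)
    buffer = []
    for item in upstream:
        # Echo: enqueue the same (already-loaded) item several times
        # instead of waiting on the slow upstream pipeline stage.
        buffer.extend([item] * echo_factor)
        rng.shuffle(buffer)
        while len(buffer) > buffer_size:
            yield buffer.pop()
    # Drain whatever remains once the upstream source is exhausted.
    while buffer:
        yield buffer.pop()
```

In a real training loop, the echoing stage would be placed after the expensive pipeline step (e.g. decoding or augmentation) so the repeated work is amortized; with `echo_factor=2`, each expensive upstream step feeds two optimizer steps.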

Syllabus

Intro
Pipeline
Graphics
Claims
Models
Experiments
Final Experiments


Taught by

Yannic Kilcher

Related Courses

Information Theory
The Chinese University of Hong Kong via Coursera
Intro to Computer Science
University of Virginia via Udacity
Analytic Combinatorics, Part I
Princeton University via Coursera
Algorithms, Part I
Princeton University via Coursera
Divide and Conquer, Sorting and Searching, and Randomized Algorithms
Stanford University via Coursera