Trillion Parameter Models Are Here
Offered By: Edan Meyer via YouTube
Course Description
Overview
Explore the groundbreaking advancements in training large-scale machine learning models through Microsoft's ZeRO-Infinity technology in this 27-minute video. Learn how this innovation overcomes previous GPU memory limitations, enabling the training of models with trillions of parameters on modest GPU resources and the fine-tuning of billion-parameter models on a single GPU. Discover the implications for working with large models like GPT-2, and understand the technical aspects of ZeRO-Infinity, including its forward step and parallelization techniques. Delve into the results and potential applications of this technology, which promises to revolutionize deep learning training by unlocking unprecedented model scale.
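For viewers who want to experiment themselves, ZeRO-Infinity is available in Microsoft's open-source DeepSpeed library as ZeRO stage 3 combined with CPU/NVMe offloading of parameters and optimizer state. The sketch below is not from the video; it is a minimal, hedged example of what such a fine-tuning setup could look like, with the model name ("gpt2"), NVMe path, learning rate, and batch size chosen purely for illustration.

```python
# Minimal sketch: fine-tuning a Hugging Face causal LM with DeepSpeed ZeRO-Infinity
# (ZeRO stage 3 + NVMe offload). Paths, model choice, and hyperparameters are
# illustrative assumptions, not values taken from the video.
import torch
import deepspeed
from transformers import AutoModelForCausalLM

ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-5}},
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,  # partition parameters, gradients, and optimizer state
        "offload_param": {"device": "nvme", "nvme_path": "/local_nvme"},
        "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
    },
}

# GPT-2 used here only as a small stand-in; a billion-parameter model is configured the same way.
model = AutoModelForCausalLM.from_pretrained("gpt2")

engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# One illustrative training step: DeepSpeed streams offloaded tensors back to the GPU as needed.
input_ids = torch.randint(0, 50257, (1, 128), device=engine.device)
loss = engine(input_ids=input_ids, labels=input_ids).loss
engine.backward(loss)
engine.step()
```

The key idea, as the video's single-GPU fine-tuning claim suggests, is that the GPU only needs to hold the tensors for the layer currently being computed; everything else lives in CPU memory or on NVMe until it is fetched.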
Syllabus
Intro
Motivation
Paper
Forward Step
Parallelization
Results
Taught by
Edan Meyer