YoVDO

Trillion Parameter Models Are Here

Offered By: Edan Meyer via YouTube

Tags

Machine Learning Courses
GPT-2 Courses
Parallelization Courses
Model Training Courses

Course Description

Overview

Explore advances in training large-scale machine learning models with Microsoft's ZeRO-Infinity technology in this 27-minute video. Learn how it overcomes GPU memory limits by offloading model states to CPU and NVMe memory, enabling the training of models with trillions of parameters on modest GPU resources and the fine-tuning of billion-parameter models on a single GPU. Discover the implications for working with models like GPT-2, and examine the technical aspects of ZeRO-Infinity, including its forward step and parallelization scheme, before reviewing the results and potential applications of a technique that unlocks unprecedented model scale in deep learning training.
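To see why trillion-parameter training needs techniques like ZeRO-Infinity at all, a quick back-of-the-envelope sketch helps. This example is not from the video; the 16-bytes-per-parameter accounting (fp16 weights and gradients plus fp32 Adam optimizer states) comes from the ZeRO line of papers, and the 80 GB accelerator size is an assumed, typical figure.

```python
# Back-of-the-envelope memory estimate for mixed-precision Adam training.
# Assumes ~16 bytes per parameter, as in the ZeRO papers:
#   fp16 weights (2) + fp16 gradients (2) + fp32 optimizer states (12).
BYTES_PER_PARAM = 16

def training_memory_bytes(n_params: int, bytes_per_param: int = BYTES_PER_PARAM) -> int:
    """Approximate memory needed to hold all training state for a model."""
    return n_params * bytes_per_param

if __name__ == "__main__":
    one_trillion = 10**12
    total = training_memory_bytes(one_trillion)   # ~16 TB of model state
    gpu_mem = 80 * 10**9                          # assumed 80 GB per GPU
    gpus_needed = -(-total // gpu_mem)            # ceiling division
    print(f"{total / 10**12:.0f} TB of state -> >= {gpus_needed} GPUs without offloading")
```

Under these assumptions, a trillion-parameter model carries roughly 16 TB of training state, i.e. on the order of 200 GPUs just to hold it in device memory; offloading that state to CPU and NVMe is what lets ZeRO-Infinity get by with far fewer GPUs.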

Syllabus

Intro
Motivation
Paper
Forward Step
Parallelization
Results


Taught by

Edan Meyer

Related Courses

How Google does Machine Learning en Español
Google Cloud via Coursera
Creating Custom Callbacks in Keras
Coursera Project Network via Coursera
Automatic Machine Learning with H2O AutoML and Python
Coursera Project Network via Coursera
AI in Healthcare Capstone
Stanford University via Coursera
AutoML con Pycaret y TPOT
Coursera Project Network via Coursera