Trillion Parameter Models Are Here
Offered By: Edan Meyer via YouTube
Course Description
Overview
Explore the advances in training large-scale machine learning models made possible by Microsoft's ZeRO-Infinity technology in this 27-minute video. Learn how this innovation overcomes previous GPU memory limitations, enabling the training of models with trillions of parameters on modest GPU resources and the fine-tuning of billion-parameter models on a single GPU. Discover the implications for working with large models such as GPT-2, and examine the technical aspects of ZeRO-Infinity, including its forward step and its parallelization techniques. Finally, review the results and potential applications of a technology that promises to reshape deep learning training by unlocking unprecedented model scale.
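The GPU memory limitation the video addresses can be made concrete with a back-of-the-envelope estimate. Under standard mixed-precision Adam training, each parameter is commonly estimated to cost about 16 bytes (fp16 weights and gradients plus fp32 optimizer states), a figure cited in the ZeRO line of work; the sketch below uses that assumption and is not taken from the video itself:

```python
# Rough memory estimate for mixed-precision Adam training, using the
# ~16 bytes/parameter figure commonly cited in the ZeRO papers.
def training_memory_gb(num_params: float) -> float:
    fp16_weights  = 2  # bytes per parameter
    fp16_grads    = 2
    fp32_weights  = 4  # optimizer master copy
    fp32_momentum = 4  # Adam first moment
    fp32_variance = 4  # Adam second moment
    bytes_per_param = (fp16_weights + fp16_grads + fp32_weights
                       + fp32_momentum + fp32_variance)  # = 16
    return num_params * bytes_per_param / 1e9

# A trillion-parameter model needs on the order of 16 TB just for model
# and optimizer state -- far beyond any single GPU's memory, which is why
# ZeRO-Infinity offloads state to CPU memory and NVMe storage.
print(f"{training_memory_gb(1e12):,.0f} GB")  # 16,000 GB
```

This is only the model and optimizer state; activations add further memory pressure on top.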
Syllabus
Intro
Motivation
Paper
Forward Step
Parallelization
Results
Taught by
Edan Meyer
Related Courses
Generating New Recipes using GPT-2 — Coursera Project Network via Coursera
Deep Learning NLP: Training GPT-2 from scratch — Coursera Project Network via Coursera
Artificial Creativity — Parsons School of Design via Coursera
Coding Train Late Night - GPT-2, Hue Lights, Discord Bot — Coding Train via YouTube
Coding Train Late Night - Fetch, GPT-2 and RunwayML — Coding Train via YouTube