Turing-NLG, DeepSpeed and the ZeRO Optimizer
Offered By: Yannic Kilcher via YouTube
Course Description
Overview
Explore Microsoft's groundbreaking 17-billion-parameter Turing-NLG language model and the ZeRO optimizer behind it in this informative video. Dive into the technical details of how ZeRO partitions optimizer state across data-parallel workers, delivering the memory savings of model parallelism without sacrificing training speed. Learn about Turing-NLG's state-of-the-art perplexity results and the DeepSpeed framework that powers its training. Gain insight into the latest advances in large-scale language model training and the optimization techniques pushing the boundaries of natural language processing.
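The video itself stays at the conceptual level, but the short sketch below (not from the video) shows how ZeRO is typically switched on through a DeepSpeed configuration; the toy model, batch size, and optimizer settings are illustrative assumptions, not details taken from the talk.

    # Minimal sketch, assuming DeepSpeed is installed; model and hyperparameters
    # are placeholder assumptions for illustration only.
    import torch
    import deepspeed

    ds_config = {
        "train_batch_size": 8,
        "fp16": {"enabled": True},
        # ZeRO stage 1 partitions optimizer states across data-parallel ranks;
        # higher stages also partition gradients (stage 2) and parameters (stage 3).
        "zero_optimization": {"stage": 1},
        "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    }

    # Stand-in for a large Transformer language model.
    model = torch.nn.Sequential(
        torch.nn.Linear(1024, 4096),
        torch.nn.GELU(),
        torch.nn.Linear(4096, 1024),
    )

    # deepspeed.initialize wraps the model and builds the ZeRO-enabled optimizer.
    model_engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )

In practice such a script is started with the deepspeed launcher (for example, deepspeed train.py), which sets up the distributed environment so the configured batch size can be split across the available GPUs.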
Syllabus
Turing-NLG, DeepSpeed and the ZeRO optimizer
Taught by
Yannic Kilcher
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Artificial Intelligence for Robotics - Stanford University via Udacity
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent