Scalable Training of Language Models Using Ray, JAX, and TPUv4
Offered By: Anyscale via YouTube
Course Description
Overview
Explore the challenges and design decisions behind building a scalable training framework for large language models in this 34-minute conference talk from Ray Summit 2022. Delve into a quantitative analysis of the efficiency gains from adopting new software and hardware, including Ray, JAX pjit, and TPUv4. Learn why the size of modern large language models demands distributed training strategies, and gain insight into the rapid developments on both the software and hardware fronts that address the challenges of efficient and robust training.
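For context on the kind of sharded training the talk refers to, the sketch below is a minimal, illustrative pjit example (not code from the talk): it builds a device mesh over the available accelerators and compiles a toy matmul with explicit input/output partition specs. It assumes a recent JAX 0.4-style API; older releases spell the keyword arguments in_axis_resources / out_axis_resources, and on real TPUv4 pods the mesh would span many hosts.

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, PartitionSpec as P
from jax.experimental.pjit import pjit

# Build a 1-D device mesh over whatever accelerators are available
# (TPU cores on a TPU host, otherwise CPU/GPU devices).
devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("data",))

# A toy "model step": a matmul whose activations are sharded across the mesh.
def matmul(x, w):
    return jnp.dot(x, w)

# pjit compiles the function for the mesh: x is sharded along the "data"
# axis, w is replicated on every device, and the output stays data-sharded.
sharded_matmul = pjit(
    matmul,
    in_shardings=(P("data", None), P(None, None)),
    out_shardings=P("data", None),
)

x = jnp.ones((8 * len(devices), 16))
w = jnp.ones((16, 4))

with mesh:
    y = sharded_matmul(x, w)

print(y.shape)  # (8 * num_devices, 4)
```

In practice the same mechanism is used to shard model parameters and optimizer state, not just the batch dimension, which is what makes training models too large for a single device feasible.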
Syllabus
Scalable training of language models using Ray, JAX, and TPUv4 at Cohere
Taught by
Anyscale
Related Courses
JAX Crash Course - Accelerating Machine Learning Code (AssemblyAI via YouTube)
NFNets - High-Performance Large-Scale Image Recognition Without Normalization (Yannic Kilcher via YouTube)
Coding a Neural Network from Scratch in Pure JAX - Machine Learning with JAX - Tutorial 3 (Aleksa Gordić - The AI Epiphany via YouTube)
Diffrax - Numerical Differential Equation Solvers in JAX (Fields Institute via YouTube)
JAX - Accelerated Machine Learning Research via Composable Function Transformations in Python (Fields Institute via YouTube)