Scalable Training of Language Models Using Ray, JAX, and TPUv4
Offered By: Anyscale via YouTube
Course Description
Overview
Explore the challenges and design decisions involved in building a scalable training framework for large language models in this 34-minute conference talk from Ray Summit 2022. Delve into a quantitative analysis of the efficiency gains from adopting new software and hardware, including Ray, JAX pjit, and TPUv4. Learn why the size of modern large language models demands distributed training strategies, and gain insight into the rapid developments on both the software and hardware fronts that address the challenges of efficient and robust training.
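The talk centers on JAX's pjit for partitioning computation across the devices of a TPUv4 pod slice. As a rough illustration of the idea only (not code from the talk), the sketch below shards a toy loss computation across a device mesh; the loss function, mesh axis name, and array shapes are invented for the example, and pjit's argument names vary across JAX versions (older releases used in_axis_resources rather than in_shardings).

    # Minimal data-parallel sharding sketch with pjit (illustrative only).
    import jax
    import jax.numpy as jnp
    import numpy as np
    from jax.experimental.pjit import pjit
    from jax.sharding import Mesh, PartitionSpec as P

    # Arrange all available devices (TPU cores, or a single CPU when
    # testing locally) into a 1-D logical mesh with a named "data" axis.
    devices = np.array(jax.devices())
    mesh = Mesh(devices, axis_names=("data",))

    def loss_fn(params, batch):
        # Hypothetical single-layer model: mean squared error of a linear map.
        preds = batch["x"] @ params["w"]
        return jnp.mean((preds - batch["y"]) ** 2)

    # Compile loss_fn for the mesh: the batch is split along the "data"
    # axis (data parallelism) while the parameters are replicated (P()).
    sharded_loss = pjit(
        loss_fn,
        in_shardings=(P(), {"x": P("data", None), "y": P("data", None)}),
        out_shardings=P(),
    )

    params = {"w": jnp.ones((8, 4))}
    batch = {"x": jnp.ones((16, 8)), "y": jnp.zeros((16, 4))}

    # The mesh context tells pjit which physical devices back the axis names.
    with mesh:
        print(sharded_loss(params, batch))

The same mechanism generalizes beyond data parallelism: adding more mesh axes and partitioning parameter dimensions along them yields the model- and tensor-parallel layouts that training at LLM scale requires.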
Syllabus
Scalable training of language models using Ray, JAX, and TPUv4 at Cohere
Taught by
Anyscale
Related Courses
Custom and Distributed Training with TensorFlow (DeepLearning.AI via Coursera)
Architecting Production-ready ML Models Using Google Cloud ML Engine (Pluralsight)
Building End-to-end Machine Learning Workflows with Kubeflow (Pluralsight)
Deploying PyTorch Models in Production: PyTorch Playbook (Pluralsight)
Inside TensorFlow (TensorFlow via YouTube)