Efficient Stable Diffusion Pre-Training on Billions of Images with Ray
Offered By: Databricks via YouTube
Course Description
Overview
Discover how to efficiently pre-train Stable Diffusion models on billions of images using Ray in this 30-minute conference talk from Databricks. Learn to overcome challenges in scaling data preprocessing, improving GPU utilization, ensuring fault tolerance, and managing heterogeneous clusters. Explore an end-to-end pre-training solution that achieves state-of-the-art performance at scale using Ray Data and Ray Train. Gain insights into implementing a Stable Diffusion pre-training pipeline, improving the efficiency of large-scale multimodal data processing, and scaling online preprocessing and distributed training across different GPU types to optimize utilization and reduce costs. Presented by Hao Chen and Yunxuan Xiao from Anyscale Inc., this talk offers valuable takeaways for maximizing performance and cost efficiency in Stable Diffusion pre-training at scale.
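To make the pattern concrete, here is a minimal, illustrative sketch of online preprocessing with Ray Data feeding distributed training through Ray Train. It is not the speakers' actual pipeline: the synthetic dataset, the placeholder convolution standing in for the diffusion U-Net, and the toy resource settings are all assumptions made for the example.

```python
# Illustrative sketch only: Ray Data for streamed preprocessing,
# Ray Train (TorchTrainer) for distributed training. All data, models,
# and resource settings are placeholders.
import numpy as np
import torch
import ray
from ray.train import ScalingConfig
from ray.train.torch import TorchTrainer

# Synthetic stand-in for an image dataset; a real pipeline would stream
# billions of images from object storage (e.g. ray.data.read_parquet).
ds = ray.data.from_items(
    [{"image": np.random.rand(64, 64, 3).astype("float32")} for _ in range(512)]
)

def preprocess(batch):
    # Placeholder transform: a real pipeline would decode, resize/crop,
    # and run VAE/text encoders here, possibly on separate GPU workers.
    batch["image"] = batch["image"] * 2.0 - 1.0  # scale to [-1, 1]
    return batch

# Online preprocessing: applied lazily as batches stream to the trainers.
ds = ds.map_batches(preprocess, batch_size=128)

def train_loop_per_worker(config):
    # Placeholder model standing in for the diffusion U-Net.
    model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)
    model = ray.train.torch.prepare_model(model)  # wraps for DDP
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    shard = ray.train.get_dataset_shard("train")
    for batch in shard.iter_torch_batches(batch_size=config["batch_size"]):
        images = batch["image"].permute(0, 3, 1, 2)  # NHWC -> NCHW
        loss = torch.nn.functional.mse_loss(model(images), images)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

trainer = TorchTrainer(
    train_loop_per_worker,
    train_loop_config={"batch_size": 64},
    scaling_config=ScalingConfig(num_workers=2, use_gpu=False),  # toy settings
    datasets={"train": ds},
)
trainer.fit()
```

Because the preprocessing stage and the training workers are separate Ray tasks/actors, they can in principle be scheduled onto different hardware (for example by passing resource arguments such as num_gpus to map_batches), which echoes the talk's point about using heterogeneous GPU types to improve utilization and reduce cost.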
Syllabus
Efficient Stable Diffusion Pre-Training on Billions of Images with Ray
Taught by
Databricks
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent