Simplified Training, Tuning and Deployment of Foundational Models as a Service with Red Hat OpenShift

Offered By: Anyscale via YouTube

Tags

Foundation Models, Distributed Training, Cloud Native Computing, Fine-Tuning

Course Description

Overview

Explore the simplified end-to-end lifecycle of foundation models in this 28-minute conference talk from Anyscale. Discover how Red Hat OpenShift Data Science (RHODS) realizes a cloud-native, scalable stack for training, fine-tuning, and inference. Learn about new open-source components, including the CodeFlare SDK, Multi-cloud App Dispatcher, and InstaScale, and how they integrate with Ray and PyTorch for large-scale data preparation, training, and validation. Examine the inference stack, based on ModelMesh and KServe, for deploying models in production, and understand how Data Science Pipelines orchestrate models, including versioning and tracking. Gain insights into operating this stack across public cloud and on-premise environments, and explore success stories highlighting the benefits of this full-stack approach for a range of foundation model use cases.
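
To give a concrete feel for the workflow the talk describes, below is a minimal, illustrative sketch of requesting a Ray cluster through the CodeFlare SDK, with the Multi-cloud App Dispatcher queueing the request and InstaScale provisioning machines on demand. The cluster name, namespace, resource sizes, and machine types are hypothetical, and the exact SDK calls shown here reflect one version of the CodeFlare SDK and may differ in others.

    # Illustrative sketch only -- not code from the talk. Assumes the CodeFlare SDK's
    # Cluster/ClusterConfiguration interface (circa 2023); names and values are placeholders.
    from codeflare_sdk.cluster.auth import TokenAuthentication
    from codeflare_sdk.cluster.cluster import Cluster, ClusterConfiguration

    # Authenticate against the OpenShift cluster (token and server are placeholders).
    auth = TokenAuthentication(token="sha256~<token>", server="https://api.<cluster>:6443")
    auth.login()

    # Describe the Ray cluster to be created; the dispatcher queues the request and,
    # with InstaScale enabled, machine resources are scaled up on demand.
    cluster = Cluster(ClusterConfiguration(
        name="ft-demo",                  # hypothetical cluster name
        namespace="rhods-example",       # hypothetical namespace
        num_workers=2,
        min_cpus=8, max_cpus=8,
        min_memory=32, max_memory=32,
        num_gpus=1,
        instascale=True,
        machine_types=["g4dn.xlarge"],
    ))

    cluster.up()          # submit the request to the dispatcher
    cluster.wait_ready()  # block until the Ray cluster is running
    print(cluster.details())

    # ... submit a Ray/PyTorch fine-tuning job against the running cluster ...

    cluster.down()        # tear the cluster back down when done

In the stack described in the talk, a trained or fine-tuned model would then be served through the ModelMesh/KServe inference layer and wired into Data Science Pipelines for versioning and tracking.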

Syllabus

Simplified Training, Tuning and Deployment of Foundational Models as a Service with Red Hat OpenShift


Taught by

Anyscale

Related Courses

Custom and Distributed Training with TensorFlow
DeepLearning.AI via Coursera
Architecting Production-ready ML Models Using Google Cloud ML Engine
Pluralsight
Building End-to-end Machine Learning Workflows with Kubeflow
Pluralsight
Deploying PyTorch Models in Production: PyTorch Playbook
Pluralsight
Inside TensorFlow
TensorFlow via YouTube