Simplified Training, Tuning and Deployment of Foundational Models as a Service with Red Hat OpenShift
Offered By: Anyscale via YouTube
Course Description
Overview
Explore the simplified end-to-end lifecycle of foundation models in this 28-minute conference talk from Anyscale. Discover how a cloud-native, scalable stack for training, fine-tuning, and inference is realized with Red Hat OpenShift Data Science (RHODS). Learn about new open-source components such as the CodeFlare SDK, the Multi-Cluster App Dispatcher (MCAD), and InstaScale, and how they integrate with Ray and PyTorch for large-scale data preparation, training, and validation. Examine the inference stack, based on ModelMesh and KServe, for deploying models in production, and understand how Data Science Pipelines orchestrate models, including versioning and tracking. Gain insights into operating this stack across public cloud and on-premise environments, and explore success stories highlighting the benefits of this full-stack approach for a range of foundation-model use cases.
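To make the training side of the stack concrete, the sketch below shows roughly how the CodeFlare SDK can request a Ray cluster on OpenShift (queued by MCAD, with InstaScale acquiring machines) and then run a small PyTorch task on it via Ray. This is an illustration under stated assumptions, not code from the talk: the cluster name, namespace, resource sizes, machine type, and exact parameter names are placeholders, and the SDK surface varies between releases.

```python
# Illustrative sketch only: names, sizes, and several parameters are assumptions,
# and the CodeFlare SDK API differs between releases.
import ray
import torch
from codeflare_sdk.cluster.auth import TokenAuthentication
from codeflare_sdk.cluster.cluster import Cluster, ClusterConfiguration

# Authenticate against the OpenShift cluster (token and server are placeholders).
auth = TokenAuthentication(token="sha256~<token>",
                           server="https://api.cluster.example:6443")
auth.login()

# Describe the Ray cluster; MCAD queues the request as an AppWrapper and
# InstaScale (instascale=True) can provision matching cloud machines.
cluster = Cluster(ClusterConfiguration(
    name="ft-demo",                  # assumed name
    namespace="rhods-notebooks",     # assumed namespace
    num_workers=2,
    min_cpus=8, max_cpus=8,
    min_memory=32, max_memory=32,    # GiB
    num_gpus=1,
    instascale=True,
    machine_types=["g4dn.xlarge"],   # assumed instance type
))

cluster.up()          # submit the request
cluster.wait_ready()  # block until the Ray head and workers are running

# Connect a Ray client to the provisioned cluster and run a PyTorch step.
ray.init(address=cluster.cluster_uri())

@ray.remote(num_gpus=1)
def smoke_test():
    # Stand-in for a fine-tuning script; just proves GPU-backed PyTorch runs remotely.
    x = torch.randn(4, 4, device="cuda" if torch.cuda.is_available() else "cpu")
    return (x @ x.T).sum().item()

print(ray.get(smoke_test.remote()))

cluster.down()  # release the nodes when finished
```

In practice the remote task would be replaced by a distributed fine-tuning script (for example, a Ray Train or TorchX job), with the resulting model handed off to the ModelMesh/KServe inference stack described above.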
Syllabus
Simplified Training, Tuning and Deployment of Foundational Models as a Service with Red Hat OpenShift
Taught by
Anyscale
Related Courses
TensorFlow: Working with NLP (LinkedIn Learning)
Introduction to Video Editing - Video Editing Tutorials (Great Learning via YouTube)
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning (Python Engineer via YouTube)
GPT3 and Finetuning the Core Objective Functions - A Deep Dive (David Shapiro ~ AI via YouTube)
How to Build a Q&A AI in Python - Open-Domain Question-Answering (James Briggs via YouTube)