Managing MLOps at Scale in OpenShift/Kubernetes
Offered By: DevConf via YouTube
Course Description
Overview
Explore a conference talk on managing MLOps at scale in OpenShift and Kubernetes environments. Discover how data scientists and developers can efficiently productize AI/ML models using open-source projects such as KServe, CodeFlare, and OpenDataHub. Learn about cost-effective and agile approaches that accelerate AI/ML adoption without infrastructure concerns or public cloud vendor lock-in. Gain insights into OpenDataHub's capabilities for rapid MLOps adoption and for deploying integrated open-source and third-party AI/ML tooling as a managed cloud service. Watch a practical demonstration of training, deploying, and operating AI/ML models using popular libraries and frameworks. Delivered by Roberto Carratalá at DevConf.CZ 2024, this 30-minute session offers practical guidance for organizations seeking to implement AI as a service and streamline their MLOps processes.
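The deployment step in the talk centers on KServe, which serves models on Kubernetes/OpenShift through InferenceService resources. As a rough, hypothetical sketch of that workflow (not code from the session), the snippet below uses the KServe Python SDK to declare and create an InferenceService for a scikit-learn model; the service name, namespace, and storage URI are placeholders, and exact class names may vary between SDK versions.

    # Hypothetical sketch: deploy a scikit-learn model as a KServe InferenceService.
    # The name, namespace, and storage URI below are placeholders.
    from kubernetes import client
    from kserve import (
        KServeClient,
        constants,
        V1beta1InferenceService,
        V1beta1InferenceServiceSpec,
        V1beta1PredictorSpec,
        V1beta1SKLearnSpec,
    )

    isvc = V1beta1InferenceService(
        api_version=constants.KSERVE_GROUP + "/v1beta1",
        kind=constants.KSERVE_KIND,
        metadata=client.V1ObjectMeta(name="sklearn-iris", namespace="mlops-demo"),
        spec=V1beta1InferenceServiceSpec(
            predictor=V1beta1PredictorSpec(
                sklearn=V1beta1SKLearnSpec(
                    # Placeholder model location; point this at your own model store.
                    storage_uri="gs://kfserving-examples/models/sklearn/1.0/model"
                )
            )
        ),
    )

    # Create the InferenceService in the cluster the current kubeconfig points to.
    KServeClient().create(isvc)

Once applied, KServe exposes the model behind an HTTP inference endpoint; the SDK also provides get and delete calls for managing the service, though their exact signatures depend on the installed version.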
        
Syllabus
Managing MLOps at scale in OpenShift/Kubernetes - DevConf.CZ 2024
Taught by
DevConf
Related Courses
Serverless Machine Learning Model Inference on Kubernetes with KServe - Devoxx via YouTube
Machine Learning in Fastly's Compute@Edge - Linux Foundation via YouTube
ModelMesh: Scalable AI Model Serving on Kubernetes - Linux Foundation via YouTube
MLSecOps - Automated Online and Offline ML Model Evaluations on Kubernetes - Linux Foundation via YouTube
Creating a Custom Serving Runtime in KServe ModelMesh - Hands-On Experience - Linux Foundation via YouTube
