Exploring ML Model Serving with KServe - Features and Use Cases

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Machine Learning Courses
Kubernetes Courses
KServe Courses

Course Description

Overview

Explore the world of machine learning model serving with KServe in this informative conference talk, presented with fun drawings. Dive into the fundamentals of KServe, an easy-to-use platform built on Kubernetes for deploying ML models. Learn about its high-level abstraction interfaces, performant solutions to common infrastructure problems, and features such as GPU scaling and ModelMesh serving. Discover how KServe simplifies model deployment for data scientists and engineers, allowing them to focus on building new models. Examine key components such as the Predictor, Control Plane, Data Plane, and Inference Graph. Understand KServe's standard inference protocol, data plane plugins, and monitoring capabilities. Explore use cases, multi-model serving, and the project's roadmap toward its v1.0 release. Gain insights into KServe's evolution since 2019 and the new functionality it offers ML practitioners.
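To make the deployment story concrete, below is a minimal sketch (not from the talk) of creating an InferenceService with the KServe Python SDK. It assumes KServe is already installed on the cluster; the model name, namespace, and storage URI are placeholders.

    from kubernetes import client
    from kserve import (
        KServeClient,
        V1beta1InferenceService,
        V1beta1InferenceServiceSpec,
        V1beta1PredictorSpec,
        V1beta1SKLearnSpec,
    )

    # A single Predictor serving a scikit-learn model; the storage URI is a
    # placeholder and should point at your own model artifacts.
    isvc = V1beta1InferenceService(
        api_version="serving.kserve.io/v1beta1",
        kind="InferenceService",
        metadata=client.V1ObjectMeta(name="sklearn-iris", namespace="default"),
        spec=V1beta1InferenceServiceSpec(
            predictor=V1beta1PredictorSpec(
                sklearn=V1beta1SKLearnSpec(
                    storage_uri="gs://your-bucket/models/sklearn/iris"  # placeholder
                )
            )
        ),
    )

    # The control plane takes it from here: provisioning the serving runtime,
    # routing traffic, and autoscaling replicas.
    KServeClient().create(isvc)

Once the service reports ready, clients query it through KServe's standard inference protocol rather than a framework-specific API (for example, POST /v2/models/sklearn-iris/infer when the runtime is configured for protocol V2).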

Syllabus

Introduction
Features
Predictor
Control Plane
Replicas
CoopTree
Data Plane
Inference Graph
Standard Inference Protocol
Data Plane Plugins
Monitor Logger
Serving Runtime
Multi-Model Serving
Use Cases
Inference Service
Conclusion
Questions


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Serverless Machine Learning Model Inference on Kubernetes with KServe
Devoxx via YouTube
Machine Learning in Fastly's Compute@Edge
Linux Foundation via YouTube
ModelMesh: Scalable AI Model Serving on Kubernetes
Linux Foundation via YouTube
MLSecOps - Automated Online and Offline ML Model Evaluations on Kubernetes
Linux Foundation via YouTube
Creating a Custom Serving Runtime in KServe ModelMesh - Hands-On Experience
Linux Foundation via YouTube