Serverless for ML Inference on Kubernetes - Panacea or Folly
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore the advantages and challenges of serverless computing for machine learning inference on Kubernetes in this conference talk. Delve into the results of extensive benchmarking experiments comparing serverless and traditional computing for inference workloads running on Kubernetes, using Kubeflow and the ModelDB MLOps Toolkit. Gain insights into various model types, data modalities, hardware configurations, and workloads. Learn how to architect your own Kubernetes-based ML inference system and understand the trade-offs among flexibility, operating costs, and performance. Discover whether serverless computing is truly a panacea for elastic compute in ML inference or whether its limitations outweigh its benefits.
Syllabus
Introduction
What is Serverless
ML Serving Considerations
Benchmark
Usability
Cost
Summary
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Introduction to Cloud Infrastructure Technologies - Linux Foundation via edX
Cloud Computing - Indian Institute of Technology, Kharagpur via Swayam
Elastic Cloud Infrastructure: Containers and Services en Español - Google Cloud via Coursera
Kyma – A Flexible Way to Connect and Extend Applications - SAP Learning
Modernize Infrastructure and Applications with Google Cloud - Google Cloud via Coursera