Building an ML Inference Platform with Knative - Serverless Containers on Kubernetes

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Knative Courses, Machine Learning Courses, Kubernetes Courses, Apache Kafka Courses, Serverless Computing Courses, Containerization Courses, KServe Courses

Course Description

Overview

Explore the development of a machine learning inference platform built on Knative in this conference talk. Learn how Bloomberg LP and IBM leveraged Knative's serverless capabilities to simplify and accelerate the deployment and scaling of ML-driven applications in production. Discover the advantages of Knative for running serverless containers on Kubernetes, including automated networking, traffic-volume-based autoscaling, and revision tracking. Gain insights into the evolution of the KServe project and how Knative enables blue/green/canary rollout strategies for safe ML model updates. Understand how to improve GPU utilization with scale-to-zero functionality and how to build Apache Kafka event-based inference pipelines. Examine benchmarks comparing Knative's autoscaler to the Kubernetes HPA, and learn performance-tuning tips for running large numbers of Knative services in a single cluster.
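To make the Kafka event-based inference pipeline idea concrete, here is a minimal sketch (not code from the talk): it assumes a hypothetical inference-requests topic, a hypothetical KServe predictor named sklearn-iris exposed through its Knative route at the URL shown, and the kafka-python and requests libraries. Topic name, broker address, and payload shape are illustrative assumptions.

```python
# Illustrative sketch only: consume events from Kafka and forward each one to a
# KServe predictor served behind a Knative route. Names and URLs are assumptions.
import json

import requests
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical KServe v1 predict endpoint exposed via the Knative route.
KSERVE_URL = "http://sklearn-iris.models.example.com/v1/models/sklearn-iris:predict"

consumer = KafkaConsumer(
    "inference-requests",                    # hypothetical topic name
    bootstrap_servers="my-kafka:9092",       # hypothetical broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    # KServe's v1 protocol expects {"instances": [...]}. Because the service can
    # scale to zero, the first request after an idle period may hit a cold start,
    # so a generous timeout is used here.
    payload = {"instances": message.value["instances"]}
    resp = requests.post(KSERVE_URL, json=payload, timeout=60)
    resp.raise_for_status()
    print("predictions:", resp.json().get("predictions"))
```

In a real pipeline this consumer loop would typically be replaced by a Knative Eventing Kafka source delivering CloudEvents to the service, so that the inference workload itself can scale with event volume and down to zero when the topic is idle.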

Syllabus

How We Built an ML Inference Platform with Knative - Dan Sun, Bloomberg LP & Animesh Singh, IBM


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Deploying Apache Pulsar to Google Kubernetes Engine (Pluralsight)
Stream Processing Design Patterns with Kafka Streams (LinkedIn Learning)
Apache Kafka Series - Confluent Schema Registry & REST Proxy (Udemy)
Apache Kafka Series - Kafka Connect Hands-on Learning (Udemy)
The Complete Apache Kafka Practical Guide (Udemy)