AI Deployment: Mastering LLMs with KFServing in Kubernetes

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Kubernetes Courses, Scalability Courses, Cloud Native Computing Courses

Course Description

Overview

Explore the intricacies of deploying Large Language Models (LLMs) on Kubernetes using KFServing (since renamed KServe) in this 14-minute conference talk. Delve into the integration of LLMs within cloud-native ecosystems, harnessing Kubernetes' scalability and KFServing's model-serving capabilities. Learn best practices for deploying, managing, and optimizing LLMs in a Kubernetes environment, ensuring efficient resource utilization and high-performance inference. Gain insights from Irvi Firqotul Aini of Mercari as she shares her expertise on elevating AI deployment strategies in the rapidly evolving field of artificial intelligence. Ideal for AI practitioners and cloud engineers looking to deepen their knowledge of LLM deployment techniques.
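As a flavor of the kind of deployment the talk covers, a minimal InferenceService manifest for KServe (the renamed KFServing project) might look like the sketch below. The resource name, storage URI, and model format are hypothetical placeholders, and the exact fields available depend on the KServe version in use:

```yaml
# Hypothetical sketch: serve a model via a KServe InferenceService.
# Names and the storageUri are placeholders, not from the talk.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llm-demo            # hypothetical service name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface   # assumes a runtime for this format is installed
      storageUri: "gs://example-bucket/example-llm"  # placeholder model location
      resources:
        limits:
          nvidia.com/gpu: "1"   # LLM inference typically needs GPU resources
```

Applied with `kubectl apply -f`, a manifest like this asks KServe to stand up a model server behind a Kubernetes service, with Kubernetes handling scheduling and scaling of the inference pods.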

Syllabus

AI Deployment: Mastering LLMs with KFServing in Kubernetes - Irvi Firqotul Aini, Mercari


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Introduction to Cloud Infrastructure Technologies
Linux Foundation via edX
Scalable Microservices with Kubernetes
Google via Udacity
Google Cloud Fundamentals: Core Infrastructure
Google via Coursera
Introduction to Kubernetes
Linux Foundation via edX
Fundamentals of Containers, Kubernetes, and Red Hat OpenShift
Red Hat via edX