AI Deployment: Mastering LLMs with KFServing in Kubernetes

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Kubernetes Courses
Scalability Courses
Cloud Native Computing Courses

Course Description

Overview

Explore how to deploy Large Language Models (LLMs) on Kubernetes using KFServing in this 14-minute conference talk. The session covers integrating LLMs into cloud-native ecosystems, combining Kubernetes' scalability with KFServing's model serving capabilities, and shares best practices for deploying, managing, and optimizing LLMs for efficient resource utilization and high-performance inference. Irvi Firqotul Aini of Mercari presents practical guidance for AI practitioners and cloud engineers looking to strengthen their LLM deployment strategies.
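
For orientation only (the talk itself is not reproduced here), the sketch below shows roughly what serving a model through KFServing, now maintained as KServe, involves: submitting an InferenceService custom resource to the cluster. It assumes the kubernetes Python client and a cluster with KServe/KFServing installed; the namespace, resource name, serving image, and replica bounds are illustrative placeholders, not details from the talk.

```python
# Minimal sketch: create a KServe/KFServing InferenceService from Python.
# All names, images, and resource values below are placeholders.
from kubernetes import client, config

# KServe uses the "serving.kserve.io" API group; older KFServing
# installations used "serving.kubeflow.org" instead.
GROUP = "serving.kserve.io"
VERSION = "v1beta1"
PLURAL = "inferenceservices"

inference_service = {
    "apiVersion": f"{GROUP}/{VERSION}",
    "kind": "InferenceService",
    "metadata": {"name": "llm-demo", "namespace": "default"},
    "spec": {
        "predictor": {
            # Custom predictor container wrapping an LLM server;
            # swap in your own serving image and model location.
            "containers": [
                {
                    "name": "kserve-container",
                    "image": "ghcr.io/example/llm-server:latest",  # placeholder
                    "resources": {
                        "limits": {"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
                    },
                }
            ],
            # Autoscaling bounds; KServe scales replicas with request load.
            "minReplicas": 1,
            "maxReplicas": 3,
        }
    },
}

def main() -> None:
    # Load kubeconfig; use config.load_incluster_config() when running in-cluster.
    config.load_kube_config()
    api = client.CustomObjectsApi()
    api.create_namespaced_custom_object(
        group=GROUP,
        version=VERSION,
        namespace="default",
        plural=PLURAL,
        body=inference_service,
    )

if __name__ == "__main__":
    main()
```

The same resource is more commonly written as a YAML manifest and applied with kubectl; the structure is identical.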

Syllabus

AI Deployment: Mastering LLMs with KFServing in Kubernetes - Irvi Firqotul Aini, Mercari


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Kubernetes: Cloud Native Ecosystem
LinkedIn Learning
Cloud Native Certified Kubernetes Administrator (CKA) (Legacy)
A Cloud Guru
Implement Resiliency in a Cloud-Native ASP.NET Core Microservice
Microsoft via YouTube
Open Networking & Edge Executive Forum 2021 - Day 1 Part 2 Sessions
Linux Foundation via YouTube