Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore a case study on simplifying private Large Language Model (LLM) deployments using cloud native technologies, specifically Kubernetes and OCI artifacts. Discover how these tools address data governance and security challenges while enabling efficient sharing of large artifacts between model developers and consumers. Learn how Kubernetes delivers a highly portable, cloud native inference stack, and how OCI artifacts can yield significant efficiency gains by reducing duplicate storage, increasing download speed, and minimizing governance overhead. Gain insights into incorporating Kubernetes and OCI into your MLOps journey for seamless LLM delivery in private cloud environments.
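As a rough illustration of the OCI-artifact approach described above, the talk's model-sharing workflow can be sketched with the ORAS CLI, a common tool for pushing non-image content to an OCI registry. The registry address, repository name, model filename, and artifact type below are all hypothetical placeholders, not details from the talk:

```shell
# Sketch: publish a model file to a private OCI registry as an artifact
# (not a container image). Assumes a registry reachable at localhost:5000,
# a local file model.gguf, and the ORAS CLI (https://oras.land) installed.
oras push localhost:5000/models/my-llm:v1 \
  --artifact-type application/vnd.example.model \
  model.gguf:application/octet-stream

# Consumers pull by tag; because OCI blobs are content-addressed, the
# registry deduplicates identical layers, which is where the reduced
# duplicate-storage benefit comes from.
oras pull localhost:5000/models/my-llm:v1
```

Because the model travels through the same registry, authentication, and scanning machinery as container images, governance policies already in place for images can cover model artifacts as well.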
Syllabus
Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI - Autumn Moulder & Marwan Ahmed
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Introduction to Cloud Infrastructure Technologies - Linux Foundation via edX
Scalable Microservices with Kubernetes - Google via Udacity
Google Cloud Fundamentals: Core Infrastructure - Google via Coursera
Introduction to Kubernetes - Linux Foundation via edX
Fundamentals of Containers, Kubernetes, and Red Hat OpenShift - Red Hat via edX