Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Kubernetes Courses, MLOps Courses, Data Governance Courses, Model Deployment Courses, Artifact Management Courses, OCI Artifacts Courses

Course Description

Overview

Explore a case study on simplifying private Large Language Model (LLM) deployments using cloud-native technologies, specifically Kubernetes and OCI Artifacts. Discover how these tools address data governance and security challenges while enabling efficient sharing of large artifacts between model developers and consumers. Learn about the benefits of Kubernetes in delivering a highly portable, cloud-native inference stack, and understand how OCI Artifacts can be leveraged for significant efficiency gains: reduced duplicate storage, faster downloads, and lower governance overhead. Gain insights into incorporating Kubernetes and OCI into your MLOps journey for seamless LLM delivery in private cloud environments.
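
To make the delivery pattern described above concrete, here is a minimal sketch (not taken from the talk) of pushing model weights to a private registry as an OCI artifact and pulling them on the consumer side, assuming the ORAS CLI is installed and a registry is reachable. The registry address, repository, tag, file names, and artifact type are hypothetical placeholders.

import subprocess

REGISTRY_REF = "registry.internal.example.com/models/llm-7b:v1"  # hypothetical private registry reference
MODEL_FILE = "model.safetensors"                                  # hypothetical model weights file

# Push the weights as an OCI artifact. Registry blobs are content-addressed,
# so re-pushing an unchanged file does not store a duplicate copy.
subprocess.run(
    ["oras", "push", REGISTRY_REF,
     "--artifact-type", "application/vnd.example.llm.model",
     f"{MODEL_FILE}:application/octet-stream"],
    check=True,
)

# On the consumer side (for example, an init container in an inference Pod),
# pull the artifact into a local directory before the inference server starts.
subprocess.run(["oras", "pull", REGISTRY_REF, "--output", "/models"], check=True)

In a Kubernetes setup like the one the talk describes, the pull step would typically run once per node or per Pod, so the model is fetched over the registry's standard, cacheable OCI transport rather than through ad hoc file sharing.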

Syllabus

Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI - Autumn Moulder & Marwan Ahmed


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Developing a Tabular Data Model
Microsoft via edX
Data Science in Action - Building a Predictive Churn Model
SAP Learning
Serverless Machine Learning with Tensorflow on Google Cloud Platform 日本語版
Google Cloud via Coursera
Intro to TensorFlow em Português Brasileiro
Google Cloud via Coursera
Serverless Machine Learning con TensorFlow en GCP
Google Cloud via Coursera