YoVDO

Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Kubernetes Courses, MLOps Courses, Data Governance Courses, Model Deployment Courses, Artifact Management Courses, OCI Artifacts Courses

Course Description

Overview

Explore a case study on simplifying private Large Language Model (LLM) deployments using cloud native technologies, specifically Kubernetes and OCI artifacts. Discover how these tools address data governance and security challenges while enabling efficient sharing of large artifacts between model developers and consumers. Learn about the benefits of Kubernetes in delivering a highly portable, cloud-native inference stack, and understand how OCI Artifacts can be leveraged to achieve significant efficiency gains by reducing duplicate storage, increasing download speed, and minimizing governance overhead. Gain valuable insights into incorporating Kubernetes and OCI into your MLOps journey for seamless LLM delivery in private cloud environments.
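The efficiency gains described above rest on OCI content addressing: every blob in a registry is identified by its sha256 digest and size, so identical model layers are stored and downloaded only once. As a minimal sketch of that idea (the `media_type` string and the in-memory weights below are illustrative placeholders, not from the talk):

```python
import hashlib
import json

def oci_descriptor(data: bytes, media_type: str) -> dict:
    """Build an OCI content descriptor for a blob.

    Registries reference every artifact layer by mediaType, sha256
    digest, and size; two pushes of byte-identical weights yield the
    same digest, so the registry deduplicates the storage.
    """
    return {
        "mediaType": media_type,
        "digest": "sha256:" + hashlib.sha256(data).hexdigest(),
        "size": len(data),
    }

# Hypothetical model weights; a real pipeline would stream the file
# from disk rather than hold it in memory.
weights = b"\x00" * 1024
desc = oci_descriptor(weights, "application/vnd.example.llm.weights.v1")
print(json.dumps(desc, indent=2))
```

Because the digest is derived purely from the content, a consumer cluster that already holds a layer with that digest can skip the download entirely, which is where the reduced duplicate storage and faster transfers come from.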

Syllabus

Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments - Autumn Moulder & Marwan Ahmed


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

From Monolith to Microservices
Docker via YouTube
AWS: CI/CD Pipelines and Deployment Strategies
Whizlabs via Coursera
Securing the Build and Deployment Pipeline - Challenges and Best Practices
OWASP Foundation via YouTube
High-Security, Zero-Connectivity and Air-Gapped Clouds - Delivering Complex Software with OCM and Flux
CNCF [Cloud Native Computing Foundation] via YouTube
Managing Artifacts at Scale for CI and Data Processing
CNCF [Cloud Native Computing Foundation] via YouTube