Confidential Containers for GPU Compute - Incorporating LLMs in AI Lift-and-Shift Strategy
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore the evolution of confidential containers and their integration with the GPU cloud-native stack for AI/ML workloads in this 32-minute conference talk. Delve into the transition from traditional container environments to secure, isolated ones for processing sensitive data. Learn how Kata Containers enables confidential containers, providing hardware-backed isolation while preserving container flexibility. Examine a virtualization reference architecture that supports advanced scenarios such as GPUDirect RDMA. Discover the lift-and-shift approach for migrating existing AI/ML workloads to confidential environments without code changes. Understand how this integration combines LLMs with GPU-accelerated computing, leveraging Kubernetes for orchestration while balancing computational power and data privacy.
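To make the lift-and-shift idea concrete, here is a minimal sketch of how a Kubernetes workload might opt into a Kata-backed confidential runtime while requesting a GPU. The RuntimeClass name (kata-cc) and the container image are assumptions for illustration; the actual handler name depends on how Kata/Confidential Containers is installed on the cluster. The RuntimeClass API and the nvidia.com/gpu resource name are standard Kubernetes conventions.

```yaml
# Sketch only: a RuntimeClass pointing at a Kata Containers handler.
# "kata-cc" and the handler value are assumptions; check your cluster's
# Confidential Containers installation for the real names.
apiVersion: node.k8s.io/v1
kind: RuntimeClass
metadata:
  name: kata-cc
handler: kata
---
# An existing AI/ML pod is "lifted and shifted" into a confidential VM
# simply by setting runtimeClassName; the container spec is unchanged.
apiVersion: v1
kind: Pod
metadata:
  name: llm-inference
spec:
  runtimeClassName: kata-cc          # run inside a Kata confidential guest
  containers:
  - name: inference
    image: registry.example.com/llm-inference:latest   # hypothetical image
    resources:
      limits:
        nvidia.com/gpu: 1            # GPU passed through to the guest
```

The point of the pattern is that only the scheduling metadata changes: the same image, command, and GPU request run unmodified, which is what makes the migration a lift-and-shift rather than a rewrite.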
Syllabus
Confidential Containers for GPU Compute: Incorporating LLMs in a Lift-and-Shift Strategy for AI
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Advanced Operating Systems
Georgia Institute of Technology via Udacity
Cloud Computing Applications, Part 1: Cloud Systems and Infrastructure
University of Illinois at Urbana-Champaign via Coursera
GT - Refresher - Advanced OS
Georgia Institute of Technology via Udacity
Introduction to Cloud Infrastructure Technologies
Linux Foundation via edX
Microsoft Windows Server 2012 Fundamentals: Hyper-V
Microsoft via edX