New Advances for Cross-Platform AI Applications in Docker
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore cutting-edge techniques for enhancing cross-platform GPU/AI workloads in container ecosystems in this 36-minute conference talk, which focuses on Docker's integration of the WebGPU standard. Learn how WebGPU lets containerized applications access host GPUs and AI accelerators through a single, versatile API, eliminating the need for Docker images tailored to specific GPU vendors and proprietary drivers. Watch a demonstration of the WasmEdge project leveraging WebGPU to build portable LLM inference applications in Rust, and see how Docker manages and orchestrates these applications. Gain insights into the future of cross-platform AI development and deployment with Docker, and understand its potential to streamline AI workflows across different hardware configurations.
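The talk's demo is written in Rust. As a rough illustration of the kind of API the WebGPU standard provides, the sketch below uses the wgpu crate (a Rust implementation of WebGPU) to discover a GPU adapter and open a logical device, the first steps an inference application would take before building compute pipelines. The crate choice and versions are assumptions for illustration only; the exact binding WasmEdge exposes to Wasm modules is shown in the talk, not here.

```rust
// Assumed Cargo.toml dependencies: wgpu = "0.19", pollster = "0.3"

fn main() {
    pollster::block_on(run());
}

async fn run() {
    // Create a WebGPU instance; wgpu selects a native backend
    // (Vulkan, Metal, DX12, or GL) at runtime.
    let instance = wgpu::Instance::default();

    // Ask for any available adapter: a physical GPU, an accelerator
    // exposed through the driver, or a software fallback.
    let adapter = instance
        .request_adapter(&wgpu::RequestAdapterOptions::default())
        .await
        .expect("no compatible adapter found");

    let info = adapter.get_info();
    println!("Adapter: {} ({:?})", info.name, info.backend);

    // Request a logical device and command queue; compute pipelines
    // for inference kernels would be created from this device.
    let (_device, _queue) = adapter
        .request_device(&wgpu::DeviceDescriptor::default(), None)
        .await
        .expect("failed to create device");
}
```

In the workflow described in the talk, code like this is compiled to WebAssembly and run by Docker through the WasmEdge runtime, so a single image can target different GPU vendors without baking vendor-specific drivers into the image.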
Syllabus
New Advances for Cross-Platform AI Applications in Docker - Michael Yuan
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Fundamentals of Accelerated Computing with CUDA C/C++ - Nvidia via Independent
Using GPUs to Scale and Speed-up Deep Learning - IBM via edX
Deep Learning - IBM via edX
Deep Learning with IBM - IBM via edX
Accelerating Deep Learning with GPUs - IBM via Cognitive Class