YoVDO

Cloud Native Sustainable LLM Inference in Action

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Cloud Computing, Sustainability, Kubernetes, Energy Efficiency, Kepler, vLLM

Course Description

Overview

Explore sustainable Large Language Model (LLM) inference using cloud-native technologies in this tutorial. Delve into LLMs, their energy consumption, and Kepler's role in monitoring power during LLM workloads. Discover how to balance environmental sustainability with computational efficiency by adjusting AI accelerator frequencies for optimized LLM inference. Watch a live demonstration of vLLM, an advanced inference framework, and see AI accelerator settings fine-tuned in a Kubernetes cluster to strike an ideal balance between power draw and throughput. Whether you're a developer, IT specialist, or sustainability advocate, gain insight into the future of eco-friendly cloud computing and learn how to integrate environmental sustainability into cloud-native technology solutions.
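The frequency-tuning idea described above can be sketched as a simple efficiency comparison: run the same inference workload at several accelerator clock settings, measure energy per run (in practice from Kepler's per-container GPU energy counters), and pick the setting that maximizes tokens generated per joule. The function and all numbers below are illustrative assumptions, not measurements from the tutorial.

```python
# Hypothetical sketch: comparing tokens-per-joule across accelerator clock caps.
# In a real cluster, the joule figures would come from Kepler's exported
# Prometheus metrics for the vLLM pod; the values here are made up.

def tokens_per_joule(tokens_generated: int, joules_consumed: float) -> float:
    """Energy efficiency of one inference run: output tokens per joule."""
    return tokens_generated / joules_consumed

# Same vLLM workload (10k output tokens) at three assumed GPU clock settings.
runs = {
    "uncapped":      {"tokens": 10_000, "joules": 5200.0},
    "capped_mid":    {"tokens": 10_000, "joules": 3900.0},
    "capped_low":    {"tokens": 10_000, "joules": 4300.0},  # too slow: static power dominates
}

best = max(runs, key=lambda k: tokens_per_joule(runs[k]["tokens"], runs[k]["joules"]))
for name, r in runs.items():
    print(f"{name}: {tokens_per_joule(r['tokens'], r['joules']):.2f} tokens/J")
print("most efficient setting:", best)
```

Note the non-monotonic shape: capping the clock too aggressively lengthens the run enough that idle/static power erases the savings, which is why the demo searches for a middle-ground frequency rather than simply minimizing it.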

Syllabus

Tutorial: Cloud Native Sustainable LLM Inference in Action


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Frontiers and Controversies in Astrophysics - Introduction to Exoplanets, Black Holes, and Cosmology
Yale University via YouTube
Monitoring Energy Consumption and Building Energy Efficient Systems the Cloud Native Way
Linux Foundation via YouTube
Evaluating the Energy Footprint of GitOps Architectures - A Benchmark Analysis
CNCF [Cloud Native Computing Foundation] via YouTube
Sustainability Through Accountability in a CNCF Ecosystem
CNCF [Cloud Native Computing Foundation] via YouTube
Energy Efficient Placement of Edge Workloads
CNCF [Cloud Native Computing Foundation] via YouTube