YoVDO

Cloud Native Sustainable LLM Inference in Action

Offered By: CNCF [Cloud Native Computing Foundation] via YouTube

Tags

Cloud Computing Courses, Sustainability Courses, Kubernetes Courses, Energy Efficiency Courses, Kepler Courses, vLLM Courses

Course Description

Overview

Explore sustainable Large Language Model (LLM) inference using cloud-native technologies in this tutorial. It covers LLMs, their energy consumption, and Kepler's role in monitoring power draw during LLM workloads, then shows how to balance environmental sustainability with computational efficiency by adjusting AI accelerator frequencies. A live demonstration of vLLM, an advanced inference framework, walks through fine-tuning accelerator settings in a Kubernetes cluster to find a good power-computation trade-off. Whether you are a developer, IT specialist, or sustainability advocate, the session offers practical insight into eco-friendly cloud computing and how to integrate environmental sustainability into cloud-native technology solutions.
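The "power-computation balance" the demo tunes can be made concrete as energy per generated token: integrate the accelerator's power samples (the kind of reading a Kepler exporter surfaces) over the serving window and divide by the tokens produced. A minimal sketch with synthetic numbers — the power values, intervals, and token counts below are illustrative assumptions, not measurements from the talk:

```python
# Sketch: energy-per-token from sampled accelerator power readings.
# The samples here are synthetic stand-ins for what a power exporter
# such as Kepler would report; a real setup would scrape Prometheus.

def joules(samples, interval_s):
    """Trapezoidal integration of power samples (watts) taken every interval_s seconds."""
    total = 0.0
    for a, b in zip(samples, samples[1:]):
        total += (a + b) / 2.0 * interval_s
    return total

def joules_per_token(samples, interval_s, tokens_generated):
    """Energy cost of inference expressed per generated token."""
    return joules(samples, interval_s) / tokens_generated

# Hypothetical comparison: capping the accelerator frequency lowers power
# but also throughput, so the interesting metric is joules per token.
high_freq = [300.0, 310.0, 305.0, 300.0]   # watts, 1 s apart, 900 tokens served
low_freq = [180.0, 185.0, 182.0, 180.0]    # watts, 1 s apart, 700 tokens served

print(round(joules_per_token(high_freq, 1.0, 900), 3))  # → 1.017
print(round(joules_per_token(low_freq, 1.0, 700), 3))   # → 0.781
```

In the demo's setting this comparison would be driven by changing the accelerator's frequency cap on the Kubernetes node and re-measuring; the exact tooling used is shown in the talk itself.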

Syllabus

Tutorial: Cloud Native Sustainable LLM Inference in Action


Taught by

CNCF [Cloud Native Computing Foundation]

Related Courses

Software as a Service
University of California, Berkeley via Coursera
Software Defined Networking
Georgia Institute of Technology via Coursera
Pattern-Oriented Software Architectures: Programming Mobile Services for Android Handheld Systems
Vanderbilt University via Coursera
Web-Technologien
openHPI
Données et services numériques, dans le nuage et ailleurs
Certificat informatique et internet via France Université Numérique