FATE-LLM - Empowering Large Language Models with Federated Learning
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore the intersection of large language models (LLMs) and federated learning in this 28-minute conference talk by Fangchi Wang and Layne Peng from VMware. Discover how FATE-LLM, an open-source framework built on the FATE federated learning platform, addresses challenges in LLM development such as data scarcity and privacy concerns. Learn how popular LLMs like ChatGLM and LLaMA are integrated into the federated learning paradigm, and understand the technical considerations around efficiency and security. Gain insights into KubeFATE, a cloud-native solution for managing FATE on Kubernetes, and its role in accelerating FATE-LLM workflows. Examine real-world experiments, evaluations, and the future roadmap of this approach to empowering large language models.
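To make the core idea concrete, the sketch below illustrates the kind of workflow the talk describes: each party fine-tunes only small adapter weights on its private data, and a coordinator averages those adapters so raw data never leaves the parties. This is a minimal, hypothetical illustration, not FATE-LLM's actual API; the adapter names, shapes, learning rate, and update rule are all assumptions made for demonstration.

# Illustrative sketch of federated LLM fine-tuning with parameter-efficient
# adapters: parties exchange only small adapter tensors, never raw text.
# All names, shapes, and the "training" step are hypothetical placeholders.
import numpy as np

RANK, HIDDEN = 8, 4096  # hypothetical low-rank adapter rank and hidden size


def local_update(adapter: dict, private_texts: list[str]) -> dict:
    """Stand-in for one party's local fine-tuning pass on its private data."""
    rng = np.random.default_rng(len(private_texts))
    return {
        name: w - 0.01 * rng.standard_normal(w.shape)  # fake gradient step
        for name, w in adapter.items()
    }


def fed_avg(updates: list[dict]) -> dict:
    """Coordinator-side aggregation: average each adapter tensor across parties."""
    return {
        name: np.mean([u[name] for u in updates], axis=0)
        for name in updates[0]
    }


# Two parties with disjoint private corpora; the base LLM stays frozen.
global_adapter = {
    "lora_A": np.zeros((RANK, HIDDEN)),
    "lora_B": np.zeros((HIDDEN, RANK)),
}
party_data = [["private clinical notes ..."], ["private support tickets ..."]]

for round_id in range(3):  # a few federated rounds
    updates = [local_update(global_adapter, data) for data in party_data]
    global_adapter = fed_avg(updates)
    print(f"round {round_id}: lora_A norm = {np.linalg.norm(global_adapter['lora_A']):.4f}")

In a real deployment, the local step would run an actual fine-tuning job against a frozen base model, and KubeFATE would handle deploying and connecting the FATE parties on Kubernetes; only the aggregation pattern is shown here.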
Syllabus
FATE-LLM: Empowering Large Language Models with Federated Learning - Fangchi Wang & Layne Peng
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Introduction to Data Analytics for Business - University of Colorado Boulder via Coursera
Digital and the Everyday: from codes to cloud - NPTEL via Swayam
Systems and Application Security - (ISC)² via Coursera
Protecting Health Data in the Modern Age: Getting to Grips with the GDPR - University of Groningen via FutureLearn
Teaching Impacts of Technology: Data Collection, Use, and Privacy - University of California, San Diego via Coursera