Self-Hosted LLM Agent on Your Own Laptop or Edge Device
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore the evolution of LLM applications and learn how to build a self-hosted AI agent service using open-source tools in this 35-minute conference talk by Michael Yuan from Second State. Discover the advantages of running open-source LLMs and agents on personal or private devices: enhanced privacy, cost control, value alignment, and the ability to customize models through fine-tuning and RAG prompt engineering over private data. Gain insights into the narrowing gap between open-source and proprietary LLMs, with examples of open-source models outperforming SaaS-based alternatives. Follow a step-by-step demonstration of building a complete AI agent service from an open-source LLM and a personal knowledge base. Understand the implementation of the WasmEdge + Rust stack for fast and lightweight LLM inference, which offers cross-platform portability and native performance across operating systems, CPUs, and GPUs.
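The talk's exact demo code is not reproduced on this listing page, but the general shape of a WasmEdge + Rust inference program is sketched below. This is a minimal, hedged sketch assuming the wasmedge_wasi_nn crate and WasmEdge's GGML (WASI-NN) plugin; the model alias "default", the prompt, and the buffer size are illustrative assumptions, not details taken from the talk.

```rust
// Minimal sketch: run one prompt through a GGUF model preloaded by WasmEdge.
// Assumes the wasmedge_wasi_nn crate and the WasmEdge WASI-NN GGML plugin.
use wasmedge_wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // Hypothetical prompt; a real agent would build this from the user query
    // plus retrieved knowledge-base context (RAG).
    let prompt = "You are a helpful assistant.\nUser: What is WasmEdge?\nAssistant:";

    // Load the model registered under the alias "default" at startup
    // (see the --nn-preload flag in the run command below).
    let graph = GraphBuilder::new(GraphEncoding::Ggml, ExecutionTarget::AUTO)
        .build_from_cache("default")
        .expect("failed to load the preloaded model");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");

    // The GGML backend takes the prompt as a UTF-8 byte tensor.
    ctx.set_input(0, TensorType::U8, &[1], prompt.as_bytes())
        .expect("failed to set the prompt");
    ctx.compute().expect("inference failed");

    // Read back the generated text; 4096 bytes is an arbitrary buffer size.
    let mut output = vec![0u8; 4096];
    let n = ctx.get_output(0, &mut output).expect("failed to read output");
    println!("{}", String::from_utf8_lossy(&output[..n]));
}
```

Under the same assumptions, the program would be compiled to Wasm and run portably with something like `cargo build --target wasm32-wasip1 --release` followed by `wasmedge --dir .:. --nn-preload default:GGML:AUTO:model.gguf app.wasm`, where the model file name is a placeholder for whichever open-source GGUF model you download.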
Syllabus
Self-Hosted LLM Agent on Your Own Laptop or Edge Device - Michael Yuan
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Fog Networks and the Internet of Things (Princeton University via Coursera)
AWS IoT: Developing and Deploying an Internet of Things (Amazon Web Services via edX)
Business Considerations for 5G with Edge, IoT, and AI (Linux Foundation via edX)
5G Strategy for Business Leaders (Linux Foundation via edX)
Intel® Edge AI Fundamentals with OpenVINO™ (Intel via Udacity)