Wasm as the Runtime for LLMs - Advantages and Applications
Offered By: CNCF [Cloud Native Computing Foundation] via YouTube
Course Description
Overview
Explore the emerging trend of using WebAssembly (Wasm) as a runtime for Large Language Models (LLMs) in this 28-minute conference talk by Michael Yuan of Second State. Learn why Python-based LLM applications face challenges around performance, size, and complexity, and how compiled languages such as C, C++, and Rust are gaining traction in LLM frameworks. Understand the benefits of using Wasm for LLM applications, including improved efficiency, safety, and performance with a smaller footprint. Watch demonstrations of running Llama 2 models in Wasm and developing LLM agents in Rust for Wasm environments, and gain insights into real-world use cases, such as LLM-based code review and book-based learning assistants, shown through live demonstrations.
Syllabus
Wasm Is Becoming the Runtime for LLMs - Michael Yuan, Second State
Taught by
CNCF [Cloud Native Computing Foundation]
Related Courses
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates
Sam Witteveen via YouTube
Mistral Large with Function Calling - Review and Implementation
Sam Witteveen via YouTube
Building a Custom Crew with CrewAI - Sequential vs Hierarchical Process Methods
Sam Witteveen via YouTube
Llama 3, CrewAI, and Groq - Building an Email AI Agent
Sam Witteveen via YouTube
Creating an AI Agent with LangGraph, Llama 3, and Groq - Advanced Tutorial
Sam Witteveen via YouTube