Self-hosted LLMs Across All Your Devices and GPUs
Offered By: Conf42 via YouTube
Course Description
Overview
Explore the world of self-hosted Large Language Models (LLMs) across various devices and GPUs in this informative conference talk from Conf42 LLMs 2024. Discover the benefits of open-source solutions and learn why alternatives to OpenAI might be preferable. Gain insights into the LlamaEdge API server and its role as a developer platform. Watch live demonstrations showcasing the implementation and operation of self-hosted LLMs. Delve into development and operational aspects, and witness practical applications through additional demos. Enhance your understanding of LLM deployment and management across different hardware configurations in this comprehensive 34-minute presentation.
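A key point of the talk is that a self-hosted LlamaEdge API server exposes an OpenAI-compatible interface, so existing client code can be pointed at your own hardware instead of OpenAI's service. The sketch below is illustrative rather than taken from the talk; the local port (8080), base URL, and model name are assumptions, so check the LlamaEdge documentation for the values your deployment actually uses.

```python
# Minimal sketch: calling a self-hosted LlamaEdge API server through its
# OpenAI-compatible chat completions endpoint.
# Assumptions (not from the talk): the server runs locally on port 8080 and
# the model name "llama-3-8b-chat" is purely illustrative.
import requests

BASE_URL = "http://localhost:8080/v1"  # assumed local LlamaEdge endpoint

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "llama-3-8b-chat",  # hypothetical model name
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What is LlamaEdge?"},
        ],
    },
    timeout=120,
)
response.raise_for_status()
# The response follows the OpenAI chat completions schema.
print(response.json()["choices"][0]["message"]["content"])
```

Because the request and response shapes match the OpenAI API, swapping between a hosted provider and a self-hosted server is largely a matter of changing the base URL.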
Syllabus
intro
preamble
all open source
demo
why not just openai?
why llamaedge api server?
llamaedge is a developer platform
demo
dev
ops
demo
thank you
Taught by
Conf42
Related Courses
Introduction to Artificial Intelligence - Stanford University via Udacity
Natural Language Processing - Columbia University via Coursera
Probabilistic Graphical Models 1: Representation - Stanford University via Coursera
Computer Vision: The Fundamentals - University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course) - California Institute of Technology via Independent