Self-hosted LLMs Across All Your Devices and GPUs
Offered By: Conf42 via YouTube
Course Description
Overview
Explore the world of self-hosted Large Language Models (LLMs) across various devices and GPUs in this informative conference talk from Conf42 LLMs 2024. Discover the benefits of open-source solutions and learn why alternatives to OpenAI might be preferable. Gain insights into the LlamaEdge API server and its role as a developer platform. Watch live demonstrations showcasing the implementation and operation of self-hosted LLMs. Delve into development and operational aspects, and witness practical applications through additional demos. Enhance your understanding of LLM deployment and management across different hardware configurations in this comprehensive 34-minute presentation.
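Since the talk positions LlamaEdge as a drop-in alternative to OpenAI, a minimal sketch of querying a self-hosted server may help: LlamaEdge's API server exposes an OpenAI-compatible chat-completions endpoint, so the request body follows that schema. The port (8080) and model name below are assumptions, placeholders for however the local server was actually started.

```python
import json
from urllib.request import Request, urlopen

# Assumed local endpoint: LlamaEdge's API server speaks the
# OpenAI-compatible chat-completions protocol (port is a guess).
ENDPOINT = "http://localhost:8080/v1/chat/completions"

# OpenAI-style request body; the model name is a placeholder for
# whatever GGUF model the self-hosted server is serving.
payload = {
    "model": "llama-3-8b-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why self-host an LLM?"},
    ],
}

req = Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once a LlamaEdge (or other OpenAI-compatible) server
# is running locally:
# with urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.get_method(), req.full_url)
```

Because the wire format matches OpenAI's, existing client code can usually be pointed at a self-hosted server by changing only the base URL.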
Syllabus
intro
preamble
all open source
demo
why not just openai?
why llamaedge api server?
llamaedge is a developer platform
demo
dev
ops
demo
thank you
Taught by
Conf42
Related Courses
Cloud Computing Concepts, Part 1
University of Illinois at Urbana-Champaign via Coursera
Cloud Computing Concepts: Part 2
University of Illinois at Urbana-Champaign via Coursera
Reliable Distributed Algorithms - Part 1
KTH Royal Institute of Technology via edX
Introduction to Apache Spark and AWS
University of London International Programmes via Coursera
Réalisez des calculs distribués sur des données massives
CentraleSupélec via OpenClassrooms