
Self-hosted LLMs Across All Your Devices and GPUs

Offered By: Conf42 via YouTube

Tags

Artificial Intelligence Courses Machine Learning Courses Distributed Computing Courses Edge Computing Courses GPU Acceleration Courses Open Source Courses

Course Description

Overview

Explore the world of self-hosted Large Language Models (LLMs) across various devices and GPUs in this informative conference talk from Conf42 LLMs 2024. Discover the benefits of open-source solutions and learn why alternatives to OpenAI might be preferable. Gain insights into the LlamaEdge API server and its role as a developer platform. Watch live demonstrations showcasing the implementation and operation of self-hosted LLMs. Delve into development and operational aspects, and witness practical applications through additional demos. Enhance your understanding of LLM deployment and management across different hardware configurations in this comprehensive 34-minute presentation.
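The description centers on the LlamaEdge API server, which exposes self-hosted models behind an OpenAI-compatible interface. As a minimal sketch of what driving such a server might look like, the Python snippet below sends one chat request to a locally running instance; the base URL (http://localhost:8080/v1), port, and model name "llama-2-7b-chat" are illustrative assumptions, not details taken from the talk.

    # Minimal sketch: one chat turn against a self-hosted, OpenAI-compatible endpoint.
    # Assumes a LlamaEdge API server (or similar) is already running locally and that
    # a model has been registered under the name below; adjust both to your setup.
    import requests

    BASE_URL = "http://localhost:8080/v1"    # assumed local endpoint
    MODEL = "llama-2-7b-chat"                # assumed model name

    def chat(prompt: str) -> str:
        """Send a single user message and return the assistant's reply text."""
        resp = requests.post(
            f"{BASE_URL}/chat/completions",
            json={
                "model": MODEL,
                "messages": [{"role": "user", "content": prompt}],
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(chat("In one sentence, why self-host an LLM instead of calling a hosted API?"))

Because the request and response shapes follow the OpenAI chat-completions format, the same client code can target a laptop GPU, an edge device, or a cloud instance simply by changing BASE_URL, which echoes the portability argument the talk's "why not just openai?" and developer-platform sections appear to address.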

Syllabus

intro
preamble
all open source
demo
why not just openai?
why llamaedge api server?
llamaedge is a developer platform
demo
dev
ops
demo
thank you


Taught by

Conf42

Related Courses

Fog Networks and the Internet of Things
Princeton University via Coursera
AWS IoT: Developing and Deploying an Internet of Things
Amazon Web Services via edX
Business Considerations for 5G with Edge, IoT, and AI
Linux Foundation via edX
5G Strategy for Business Leaders
Linux Foundation via edX
Intel® Edge AI Fundamentals with OpenVINO™
Intel via Udacity