Self-Hosted LLMs: A Practical Guide
Offered By: DevConf via YouTube
Course Description
Overview
Explore the world of self-hosted large language models (LLMs) in this informative 35-minute conference talk from DevConf.US 2024. Learn how to overcome the complexities of deploying and managing LLMs as speakers Hema Veeradhi and Aakanksha Duggal provide a comprehensive introductory guide. Discover the process of selecting appropriate open source LLM models from HuggingFace, containerizing them with Podman, and creating model serving and inference pipelines. Gain insights into the advantages of self-hosted setups, including increased flexibility in model training, enhanced data privacy, and reduced operational costs. By the end of the talk, acquire the necessary skills and knowledge to navigate the exciting path of self-hosting LLMs on your own laptop using open source tools and frameworks.
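The talk describes building a model serving and inference pipeline for a self-hosted LLM. As a rough illustration of what such a pipeline's serving layer might look like, here is a minimal sketch of a local inference endpoint using only the Python standard library. The `generate()` stub, endpoint shape, and port are assumptions for illustration, not details from the talk; in practice the stub would be replaced by a real model call (e.g. a HuggingFace `transformers` text-generation pipeline) and the whole script would run inside a Podman container.

```python
# Minimal sketch of a local model-serving endpoint (assumed design,
# not the speakers' implementation). POST a JSON body like
# {"prompt": "..."} and receive {"completion": "..."} back.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate(prompt: str) -> str:
    # Placeholder standing in for a real LLM call, e.g. a HuggingFace
    # transformers pipeline("text-generation") invocation.
    return prompt + " ... (model output)"


class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON request body.
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        reply = {"completion": generate(body.get("prompt", ""))}
        data = json.dumps(reply).encode()
        # Return the completion as a JSON response.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        # Suppress per-request logging to keep output quiet.
        pass


# To serve standalone (e.g. as a container's entrypoint):
#   HTTPServer(("0.0.0.0", 8000), InferenceHandler).serve_forever()
```

Containerizing this with Podman would then amount to a small Containerfile that installs the dependencies, copies the script, and exposes the chosen port.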
Syllabus
Self-Hosted LLMs: A Practical Guide - DevConf.US 2024
Taught by
DevConf
Related Courses
Getting Started with Podman (Pluralsight)
Docker do 0 à Maestria: Contêineres Desmistificados + EXTRAS (Udemy)
CompTIA Linux+: Scripting, Containers, and Automation (Pluralsight)
Hands-on with Podman Containers on Linux (A Cloud Guru)
Complete Intro to Containers (feat. Docker) (Frontend Masters)