Self-Hosted LLMs: A Practical Guide
Offered By: DevConf via YouTube
Course Description
Overview
Explore the world of self-hosted large language models (LLMs) in this informative 35-minute conference talk from DevConf.US 2024. Learn how to overcome the complexities of deploying and managing LLMs as speakers Hema Veeradhi and Aakanksha Duggal provide a comprehensive introductory guide. Discover the process of selecting appropriate open source LLM models from HuggingFace, containerizing them with Podman, and creating model serving and inference pipelines. Gain insights into the advantages of self-hosted setups, including increased flexibility in model training, enhanced data privacy, and reduced operational costs. By the end of the talk, acquire the necessary skills and knowledge to navigate the exciting path of self-hosting LLMs on your own laptop using open source tools and frameworks.
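As a rough illustration of the inference step the talk describes (not material from the talk itself), the sketch below loads a small open source model from HuggingFace with the transformers library and generates a completion locally; the model name and generation settings are illustrative assumptions chosen to fit on a laptop.

```python
# Minimal local-inference sketch using HuggingFace transformers.
# The model name and generation settings are assumptions for illustration,
# not the specific setup presented in the talk.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # small open source model suited to laptop hardware

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Explain why self-hosting an LLM can improve data privacy."
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion on CPU; adjust max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In a self-hosted setup of the kind the talk outlines, a script like this would typically be packaged into a container image and run with Podman, so the model and its data never leave your own machine.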
Syllabus
Self-Hosted LLMs: A Practical Guide - DevConf.US 2024
Taught by
DevConf
Related Courses
How Google does Machine Learning en Español - Google Cloud via Coursera
Creating Custom Callbacks in Keras - Coursera Project Network via Coursera
Automatic Machine Learning with H2O AutoML and Python - Coursera Project Network via Coursera
AI in Healthcare Capstone - Stanford University via Coursera
AutoML con Pycaret y TPOT - Coursera Project Network via Coursera