Private, Local AI: Running Open-Source LLMs Locally
Offered By: MLOps World: Machine Learning in Production via YouTube
Course Description
Overview
Explore the world of private AI in this 37-minute conference talk from MLOps World: Machine Learning in Production. Dive into Sanctum AI, a private AI application that enables running full-featured, open-source Large Language Models (LLMs) locally on Mac or PC. Learn how users can access any open-source LLM available on Hugging Face in an encrypted, local environment. Discover the benefits of private AI for users working with sensitive data, whether fine-tuning models or uploading and chatting with PDFs and financial documents. Gain insights into the development of Sanctum and the importance of making leading open-source language models easily accessible in a secure, local setting.
Syllabus
Private, Local AI
Taught by
MLOps World: Machine Learning in Production
Related Courses
Creating Versatile AI Agents Through WebAssembly and Rust (Linux Foundation via YouTube)
Building a Q&A App with RAG, LangChain, and Open-Source LLMs - Step-by-Step Guide (Code With Aarohi via YouTube)
Self-Hosted LLM Agent on Your Own Laptop or Edge Device (CNCF [Cloud Native Computing Foundation] via YouTube)
Open Source LLMs: Viable for Production or a Low-Quality Toy? (Anyscale via YouTube)
GPT-4 vs Open Source LLMs: Epic Rap Battles Test Creativity with AutoGen (Data Centric via YouTube)