YoVDO

Run Local LLMs on Hardware from $50 to $50,000 - Testing and Comparison

Offered By: Dave's Garage via YouTube

Tags

Ollama Courses, Raspberry Pi Courses, LLaMA (Large Language Model Meta AI) Courses

Course Description

Overview

Explore the performance of local large language models across a diverse range of hardware in this comprehensive video comparison. Witness firsthand testing of Llama 3.1 and Llama 3.2 using Ollama on devices ranging from a budget-friendly Raspberry Pi 4 to a high-end Dell AI workstation. Learn how to install Ollama on Windows and observe the capabilities of various systems, including a Herk Orion Mini PC, a gaming rig with a Threadripper 3970X, an M2 Mac Pro, and a powerful workstation featuring a Threadripper 7995WX with NVIDIA RTX 6000 Ada GPUs. Discover which hardware configurations excel at handling different model sizes, up to the massive 405-billion-parameter model. Gain insights into the practical aspects of running LLMs locally and understand the trade-offs between cost and performance across this wide spectrum of computing devices.
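Once Ollama is installed on any of these machines, it exposes a local HTTP API (by default on port 11434) that the same client code can use regardless of the hardware underneath. As a minimal sketch of that workflow, assuming a default Ollama install with a model tag such as "llama3.1" already pulled:

```python
import json
import urllib.request

# Ollama's local server listens on port 11434 by default;
# /api/generate is its text-completion endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request body for Ollama."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """POST the prompt to a locally running Ollama server and
    return the model's response text (requires Ollama running)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the request shape is identical everywhere, the same script can benchmark a Raspberry Pi and a Threadripper workstation; only the model tag and the tokens-per-second differ.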

Syllabus

Run Local LLMs on Hardware from $50 to $50,000 - We Test and Compare!


Taught by

Dave's Garage

Related Courses

The GenAI Stack - From Zero to Database-Backed Support Bot
Docker via YouTube
Ollama Crash Course: Running AI Models Locally Offline on CPU
1littlecoder via YouTube
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop
Docker via YouTube
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs
Jeremy Chone via YouTube
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates
Sam Witteveen via YouTube