Run Local LLMs on Hardware from $50 to $50,000 - Testing and Comparison

Offered By: Dave's Garage via YouTube

Tags

Ollama Courses
Raspberry Pi Courses
LLaMA (Large Language Model Meta AI) Courses

Course Description

Overview

Explore the performance of local large language models across a diverse range of hardware in this comprehensive video comparison. Watch firsthand testing of Llama 3.1 and Llama 3.2 using Ollama on devices ranging from a budget-friendly Raspberry Pi 4 to a high-end Dell AI workstation. Learn how to install Ollama on Windows and observe the capabilities of various systems, including a Herk Orion mini PC, a gaming rig with a Threadripper 3970X, an M2 Mac Pro, and a powerful workstation pairing a Threadripper 7995WX with NVIDIA RTX 6000 Ada GPUs. Discover which hardware configurations excel at handling different model sizes, up to the massive 405-billion-parameter model, and gain insight into the practical aspects of running LLMs locally and the trade-offs between cost and performance across this wide spectrum of computing devices.
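
To make the workflow concrete, here is a minimal sketch of the kind of local inference the video demonstrates: it queries an Ollama server over its local HTTP API and estimates generation throughput in tokens per second. It assumes Ollama is already installed and serving on its default port (11434); the model tag and prompt are placeholders chosen for illustration, not taken from the video.

import json
import time
import urllib.request

# Assumes a local Ollama server is running on its default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def time_generation(model: str, prompt: str) -> None:
    """Send one non-streaming generation request and report rough throughput."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    elapsed = time.time() - start
    # "eval_count" is the number of generated tokens in Ollama's response.
    tokens = result.get("eval_count", 0)
    print(f"{model}: {tokens} tokens in {elapsed:.1f}s (~{tokens / elapsed:.1f} tokens/s)")

if __name__ == "__main__":
    # Example model tag: a small machine might run llama3.2, while a large
    # workstation could attempt llama3.1:70b or the 405B model.
    time_generation("llama3.2", "Explain what a large language model is in one paragraph.")

Running this against the same model on different machines gives a simple, comparable tokens-per-second figure, which is essentially the trade-off the video explores across its hardware lineup.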

Syllabus

Run Local LLMs on Hardware from $50 to $50,000 - We Test and Compare!


Taught by

Dave's Garage

Related Courses

Develop Java Embedded Applications Using a Raspberry Pi
Oracle via Independent
Introducción a Raspberry Pi (Ver-2)
Galileo University via Independent
Interfacing with the Raspberry Pi
University of California, Irvine via Coursera
Robotic Motion Systems
University of California, Irvine via Coursera
The Raspberry Pi Platform and Python Programming for the Raspberry Pi
University of California, Irvine via Coursera