YoVDO

Run Local LLMs on Hardware from $50 to $50,000 - Testing and Comparison

Offered By: Dave's Garage via YouTube

Tags

Ollama Courses
Raspberry Pi Courses
LLaMA (Large Language Model Meta AI) Courses

Course Description

Overview

Explore the performance of local large language models across a wide range of hardware in this video comparison. Watch firsthand testing of Llama 3.1 and Llama 3.2 using Ollama on devices ranging from a budget-friendly Raspberry Pi 4 to a high-end Dell AI workstation. Learn how to install Ollama on Windows and observe the capabilities of various systems, including a Herk Orion Mini PC, a gaming rig with a Threadripper 3970X, an M2 Mac Pro, and a powerful workstation pairing a Threadripper 7995WX with NVIDIA RTX 6000 Ada GPUs. Discover which hardware configurations can handle different model sizes, up to the massive 405-billion-parameter model, and gain insight into the practical trade-offs between cost and performance when running LLMs locally across this spectrum of computing devices.
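For context, the kind of benchmarking shown in the video can be reproduced with a short script. The sketch below (not taken from the video) queries a local Ollama server over its REST API and computes generation throughput in tokens per second, the figure typically used to compare machines. It assumes Ollama is installed and running on its default port (11434) and that the chosen model tag has already been pulled; the model name and prompt are illustrative placeholders.

```python
# Minimal sketch: measure Ollama generation throughput on the local machine.
# Assumes the Ollama server is running at its default address and the model
# below has been pulled beforehand (e.g. with `ollama pull llama3.2`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint
MODEL = "llama3.2"  # illustrative choice; swap in llama3.1, llama3.1:405b, etc.

payload = json.dumps({
    "model": MODEL,
    "prompt": "Explain in one sentence why memory bandwidth matters for local LLMs.",
    "stream": False,  # return a single JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds),
# which together give the tokens-per-second figure used to compare hardware.
tokens = result.get("eval_count", 0)
seconds = result.get("eval_duration", 0) / 1e9
print(result["response"])
if seconds > 0:
    print(f"{MODEL}: {tokens / seconds:.1f} tokens/sec on this machine")
```

Running the same script with the same model and prompt on each machine gives a rough like-for-like throughput comparison across the hardware tiers discussed in the video.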

Syllabus

Run Local LLMs on Hardware from $50 to $50,000 - We Test and Compare!


Taught by

Dave's Garage

Related Courses

LLaMA: Open and Efficient Foundation Language Models - Paper Explained
Yannic Kilcher via YouTube
Alpaca & LLaMA - Can it Compete with ChatGPT?
Venelin Valkov via YouTube
Experimenting with Alpaca & LLaMA
Aladdin Persson via YouTube
What's LLaMA? ChatLLaMA? - And Some ChatGPT/InstructGPT
Aladdin Persson via YouTube
Llama Index - Step by Step Introduction
echohive via YouTube