YoVDO

Ollama Crash Course: Running AI Models Locally Offline on CPU

Offered By: 1littlecoder via YouTube

Tags

Ollama Courses

Course Description

Overview

Learn to run AI models locally and offline using only a CPU in this 24-minute crash course on Ollama. Get an introduction to Ollama, run local Large Language Models (LLMs), serve LLMs as API endpoints, work with character-based LLMs, and use the GGUF model format. Gain hands-on experience with Ollama's GitHub repository, model library, and project page to deepen your understanding of offline AI model deployment.
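Once a model is pulled (e.g. with `ollama pull llama2`), the Ollama server exposes a REST API on localhost that the course's "LLMs as API endpoints" segment covers. The sketch below, which assumes a server running at Ollama's default address (`http://localhost:11434`) and a model name such as `llama2`, shows how a client might call the `/api/generate` endpoint using only the Python standard library:

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes `ollama serve` is running).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    stream=False asks the server for a single JSON reply instead of
    a stream of partial responses.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires a running Ollama server with the model pulled):
#   reply = generate("llama2", "Why run models locally?")
```

Because everything runs on localhost, no data leaves the machine, which is the core appeal of the offline, CPU-only workflow the course demonstrates.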

Syllabus

Just RUN AI Models Locally OFFLINE CPU!!!


Taught by

1littlecoder

Related Courses

The GenAI Stack - From Zero to Database-Backed Support Bot
Docker via YouTube
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop
Docker via YouTube
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs
Jeremy Chone via YouTube
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates
Sam Witteveen via YouTube
Image Annotation with LLaVA and Ollama
Sam Witteveen via YouTube