Ollama and Python for Local AI LLM Systems
Offered By: Eli the Computer Guy via YouTube
Course Description
Overview
Learn how to set up and use Ollama and Python for local AI Large Language Model (LLM) systems in this 30-minute tutorial. Explore how to install Ollama and various models, pull models and run Ollama from the command line, and integrate Python with Ollama for enhanced functionality. Gain practical insight through a hands-on demonstration, and conclude with final thoughts on implementing these tools for local AI development.
Syllabus
Introduction
Demonstration
Installing Ollama and Models
Pulling Models and Running Ollama at the Shell
Using Python with Ollama
Final Thoughts
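The "Using Python with Ollama" step can be sketched as a small script against Ollama's local REST API, which by default listens on localhost:11434. This is an illustrative sketch, not the tutorial's own code: the model name "llama3" is an assumption (substitute any model you have pulled, e.g. with `ollama pull llama3` at the shell), and the server must already be running (`ollama serve`).

```python
import json
import urllib.request

# Assumed model name for illustration; pull it first at the shell,
# e.g. `ollama pull llama3`.
MODEL = "llama3"


def build_chat_request(model, prompt):
    """Build the JSON payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete reply instead of chunks
    }


def chat(prompt, host="http://localhost:11434"):
    """Send a chat request to a locally running Ollama server."""
    payload = json.dumps(build_chat_request(MODEL, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


if __name__ == "__main__":
    try:
        print(chat("Why run an LLM locally?"))
    except OSError:
        # Server not reachable: start it with `ollama serve` and retry.
        print("Ollama server not reachable on localhost:11434.")
```

The official ollama Python package wraps the same endpoint (e.g. `ollama.chat(...)`), so the raw-HTTP version above mainly shows what travels over the wire.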
Taught by
Eli the Computer Guy
Related Courses
The GenAI Stack - From Zero to Database-Backed Support Bot (Docker via YouTube)
Ollama Crash Course: Running AI Models Locally Offline on CPU (1littlecoder via YouTube)
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop (Docker via YouTube)
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs (Jeremy Chone via YouTube)
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates (Sam Witteveen via YouTube)