Ollama and Python for Local AI LLM Systems
Offered By: Eli the Computer Guy via YouTube
Course Description
Overview
Learn how to set up and use Ollama with Python to run local large language model (LLM) systems in this 30-minute tutorial. See how to install Ollama and its models, pull models and run Ollama from the command line, and call Ollama from Python scripts. A practical demonstration ties the pieces together, and the video closes with final thoughts on applying these tools to local AI development.
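The workflow described above can be sketched in a few lines. The example below is not taken from the video itself; it assumes the official `ollama` Python package (installed with `pip install ollama`), a local Ollama service already running, and an illustrative model name of `llama3` that has been pulled beforehand.

```python
# Sketch only: assumes `pip install ollama`, a running local Ollama service,
# and a model already pulled at the shell, for example:
#   ollama pull llama3
#   ollama run llama3        # interactive chat at the command line
import ollama

# Send a single chat message to the locally hosted model and print the reply.
response = ollama.chat(
    model="llama3",  # illustrative model name, not necessarily the one used in the video
    messages=[{"role": "user", "content": "Summarize what a local LLM is in one sentence."}],
)
print(response["message"]["content"])
```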
Syllabus
Introduction
Demonstration
Installing Ollama and Models
Pulling Models and Running Ollama at the Shell
Using Python with Ollama (see the example sketch after this syllabus)
Final Thoughts
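As a further illustration of the "Using Python with Ollama" topic, the same client library can also stream a reply token by token rather than waiting for the full response. This is a hedged sketch under the same assumptions as above; the model name and prompt are placeholders, not necessarily what the lesson uses.

```python
# Sketch only: streams a response chunk by chunk using the `ollama` package.
# Assumes a local Ollama service and a pulled model named "llama3".
import ollama

stream = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Give three uses for a local LLM."}],
    stream=True,  # yields partial responses instead of one final message
)

# Print each partial piece of the reply as it arrives.
for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
print()
```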
Taught by
Eli the Computer Guy
Related Courses
Introduction to Cloud Foundry and Cloud Native Software Architecture - Linux Foundation via edX
The Unix Workbench - Johns Hopkins University via Coursera
Introduction to Linux (Введение в Linux) - Bioinformatics Institute via Stepik
Linux Basics: The Command Line Interface - Dartmouth College via edX
Operating Systems and You: Becoming a Power User (Sistemas operativos y tú: Convertirse en un usuario avanzado) - Crece con Google via Coursera