Building a Chat App with Ollama-py and Streamlit in Python
Offered By: Decoder via YouTube
Course Description
Overview
Learn how to build a chat application using Python, Ollama-py, and Streamlit in this 22-minute tutorial. Explore the Ollama Python library's key methods, including list(), show(), and chat(). Set up a Python environment and dive into Streamlit for creating the user interface. Follow along as the instructor guides you through constructing the chat app step-by-step, covering user input, message history, Ollama responses, model selection, and streaming responses. Gain insights into Python generators and their application in the project. By the end, you'll have created a functional LLM chat app with a user-friendly interface.
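The app the overview describes can be sketched in a few dozen lines. The following is a minimal sketch, not the instructor's exact code: it assumes `pip install ollama streamlit`, a local Ollama server, and is run with `streamlit run chat_app.py`. Field names in the `ollama` responses follow the library at the time of the video and may differ in newer releases.

```python
# chat_app.py -- minimal Ollama + Streamlit chat sketch (assumes a local
# Ollama server is running; launch with: streamlit run chat_app.py)
import ollama
import streamlit as st

st.title("Ollama Chat")

# Model selection: ollama.list() reports locally available models.
# (The "name"/"model" key varies across ollama-py versions.)
models = [m.get("name") or m.get("model") for m in ollama.list()["models"]]
model = st.selectbox("Model", models)

# Message history survives Streamlit's rerun-on-interaction model
# only if kept in session_state.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# User input -> append to history -> stream the assistant's reply.
if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    with st.chat_message("assistant"):
        # stream=True yields chunks; a generator expression adapts them
        # to the plain strings st.write_stream expects.
        stream = ollama.chat(
            model=model, messages=st.session_state.messages, stream=True
        )
        reply = st.write_stream(
            chunk["message"]["content"] for chunk in stream
        )
    st.session_state.messages.append({"role": "assistant", "content": reply})
```

`st.write_stream` renders the text incrementally and returns the concatenated reply, which is what lets the assistant's message be stored back into the history in one line.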
Syllabus
- Intro
- Why not use the CLI?
- Looking at the ollama-py library
- Setting up Python environment
- Reviewing Ollama functions
- list
- show
- chat
- Looking at Streamlit
- Start writing our app
- App: user input
- App: message history
- App: adding ollama response
- App: choosing a model
- Introducing generators
- App: streaming responses
- App: review
- Where to find the code
- Thank you for 2k
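The syllabus introduces Python generators before covering streaming responses, because a generator is what turns a chunk-by-chunk model reply into text the UI can render as it arrives. A stdlib-only illustration (the chunk shape below mimics what `ollama.chat(..., stream=True)` yields, as an assumption; `fake_chat_stream` is a stand-in, not a library function):

```python
def fake_chat_stream():
    # Stand-in for ollama.chat(..., stream=True): yields the reply in
    # pieces instead of returning it all at once.
    for word in ["Hello", ", ", "world", "!"]:
        yield {"message": {"content": word}}

def extract_text(stream):
    # A generator that adapts raw chunks to plain strings -- the shape a
    # UI helper like st.write_stream expects.
    for chunk in stream:
        yield chunk["message"]["content"]

# Each piece is available as soon as it is yielded; joining them
# reconstructs the full reply.
pieces = list(extract_text(fake_chat_stream()))
print("".join(pieces))  # → Hello, world!
```

Because both functions are lazy, nothing is buffered: each chunk flows from the model stub through `extract_text` to the consumer one item at a time.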
Taught by
Decoder
Related Courses
- The GenAI Stack - From Zero to Database-Backed Support Bot (Docker via YouTube)
- Ollama Crash Course: Running AI Models Locally Offline on CPU (1littlecoder via YouTube)
- AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop (Docker via YouTube)
- Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs (Jeremy Chone via YouTube)
- Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates (Sam Witteveen via YouTube)