YoVDO

Running Uncensored and Open Source LLMs on Your Local Machine

Offered By: JetBrains via YouTube

Tags

Machine Learning Courses Ollama Courses

Course Description

Overview

Explore the world of uncensored and open-source large language models (LLMs) that can be run on local hardware in this comprehensive 1-hour 11-minute talk. Dive into the Ollama system, which enables downloading and running open-source models without internet-based restrictions. Learn how to access AI models programmatically using the latest Java features, including sealed interfaces, records, and pattern matching. Discover the advantages of running LLMs locally, such as increased privacy and the ability to work with models that have limited guard rails. Join Java Champion and author Kenneth Kousen as he demonstrates how to leverage these powerful tools for more flexible and secure AI applications.
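The Java features mentioned above can be combined in a small client sketch. The following is not code from the talk, only a hedged illustration: it models an Ollama result with a sealed interface and records, handles it with an exhaustive pattern-matching switch (Java 21+), and builds a request against Ollama's default local endpoint (`http://localhost:11434/api/generate`); the model name `llama3` is an assumption.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch only: models an Ollama call outcome with sealed interfaces,
// records, and pattern matching, as named in the talk description.
public class OllamaSketch {
    // Sealed interface: only the two record variants below may implement it.
    sealed interface OllamaResult permits Completion, Failure {}
    record Completion(String model, String response) implements OllamaResult {}
    record Failure(String error) implements OllamaResult {}

    // Pattern matching over the sealed hierarchy; the switch is
    // exhaustive, so no default branch is required.
    static String describe(OllamaResult result) {
        return switch (result) {
            case Completion(String model, String response) ->
                model + " said: " + response;
            case Failure(String error) ->
                "request failed: " + error;
        };
    }

    // Builds the HTTP request a client would send to a locally running
    // Ollama server; "stream": false asks for a single JSON reply.
    static HttpRequest buildRequest(String model, String prompt) {
        String body = """
            {"model": "%s", "prompt": "%s", "stream": false}"""
            .formatted(model, prompt);
        return HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:11434/api/generate"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
    }

    public static void main(String[] args) {
        System.out.println(describe(new Completion("llama3", "Hello!")));
        System.out.println(describe(new Failure("model not found")));
        System.out.println(buildRequest("llama3", "Why run LLMs locally?").uri());
    }
}
```

Because the sealed hierarchy closes the set of result types, the compiler verifies the switch covers every case, which is exactly the safety benefit of pairing sealed interfaces with pattern matching.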

Syllabus

Running Uncensored and Open Source LLMs on Your Local Machine


Taught by

IntelliJ IDEA by JetBrains

Related Courses

The GenAI Stack - From Zero to Database-Backed Support Bot
Docker via YouTube
Ollama Crash Course: Running AI Models Locally Offline on CPU
1littlecoder via YouTube
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop
Docker via YouTube
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs
Jeremy Chone via YouTube
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates
Sam Witteveen via YouTube