Privacy-Friendly Applications with Ollama, Vector Functions, and LangChainJS
Offered By: JSConf via YouTube
Course Description
Overview
Explore privacy-friendly AI application development in this JSConf talk by Pratim Bhosale. Learn how to build AI applications that keep data local by using Ollama to run Large Language Models (LLMs) on your own computer. Discover how LangChainJS can create versatile agents that handle tasks autonomously while protecting sensitive information. Gain insight into the privacy concerns of cloud-based AI and the significance of local AI solutions. Delve into generating embeddings with Ollama for vector searches, and see practical demonstrations of LangChain agents performing document summarization and API interactions. Examine real-world use cases and understand how these tools help maintain data privacy in AI applications. Benefit from the expertise of Pratim Bhosale, a full-stack developer, Developer Advocate at SurrealDB, and experienced speaker in the tech community.
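As a taste of the workflow the talk covers, here is a minimal sketch of generating embeddings locally with Ollama and running a vector similarity search through LangChainJS. It assumes Ollama is serving on its default local port with the nomic-embed-text embedding model pulled, and that the @langchain/ollama and langchain packages are installed; import paths vary between LangChainJS releases, so treat this as an illustration rather than the speaker's exact code.

```typescript
// Minimal sketch: local embeddings + in-memory vector search with LangChainJS and Ollama.
// Assumes Ollama is running at http://localhost:11434 with "nomic-embed-text" pulled.
import { OllamaEmbeddings } from "@langchain/ollama";
import { MemoryVectorStore } from "langchain/vectorstores/memory";

async function main() {
  // Embeddings are computed by the local Ollama server, so documents never leave the machine.
  const embeddings = new OllamaEmbeddings({ model: "nomic-embed-text" });

  // An in-memory vector store keeps the demo self-contained; a real application
  // could swap in any LangChainJS-compatible vector database.
  const store = await MemoryVectorStore.fromTexts(
    [
      "Ollama runs large language models on your own hardware.",
      "LangChainJS agents can summarize documents and call APIs.",
    ],
    [{ id: 1 }, { id: 2 }],
    embeddings
  );

  // Similarity search over the locally generated embeddings.
  const results = await store.similaritySearch("How do I keep LLM data private?", 1);
  console.log(results[0].pageContent);
}

main().catch(console.error);
```

Because both the embedding step and the search run on the local machine, no document text or query ever reaches a cloud API, which is the privacy property the talk emphasizes.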
Syllabus
Privacy-Friendly Applications with Ollama, Vector Functions, and LangChainJS by Pratim Bhosale
Taught by
JSConf
Related Courses
The GenAI Stack - From Zero to Database-Backed Support Bot (Docker via YouTube)
Ollama Crash Course: Running AI Models Locally Offline on CPU (1littlecoder via YouTube)
AI Anytime, Anywhere - Getting Started with LLMs on Your Laptop (Docker via YouTube)
Rust Ollama Tutorial - Interfacing with Ollama API Using ollama-rs (Jeremy Chone via YouTube)
Ollama: Libraries, Vision Models, and OpenAI Compatibility Updates (Sam Witteveen via YouTube)