Retraining vs RAG vs Context - Using Local Data on Large Language Models

Offered By: Dave's Garage via YouTube

Tags

Retrieval Augmented Generation Courses, Artificial Intelligence Courses, Machine Learning Courses, Data Integration Courses

Course Description

Overview

Explore three methods of extending large language model functionality — retraining, retrieval augmented generation (RAG), and context documents — in this informative video. Learn how each technique can be applied to both local and online models, how the approaches differ, and when each is the practical choice for leveraging local data with LLMs to customize a model for a specific task or domain.
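To make the distinction concrete, here is a minimal sketch contrasting two of the approaches the video covers: stuffing a whole document into the prompt as context versus RAG, where only the most relevant chunks are retrieved first. The corpus, query, and keyword-overlap scoring below are illustrative assumptions, not material from the video; real pipelines typically use embedding similarity instead.

```python
def score(query: str, chunk: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most relevant to the query (the 'R' in RAG)."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble a prompt that grounds the model in the given context."""
    joined = "\n".join(context)
    return f"Use the context below to answer.\n\nContext:\n{joined}\n\nQuestion: {query}"

# Hypothetical local data the model was never trained on.
chunks = [
    "The garage door opener uses a 315 MHz rolling-code remote.",
    "The workshop NAS stores backups on a RAID-Z2 pool.",
    "The 3D printer firmware is Marlin 2.1 with linear advance enabled.",
]

query = "What frequency does the garage remote use?"

# Context approach: paste everything into the prompt.
# Simple, but token cost grows with the size of your data.
full_context_prompt = build_prompt(query, chunks)

# RAG approach: retrieve only the relevant chunk before prompting.
# Scales to corpora far larger than the model's context window.
rag_prompt = build_prompt(query, retrieve(query, chunks, k=1))

print(rag_prompt)
```

Retraining (fine-tuning) is the third option: instead of supplying data at prompt time, the model's weights are updated on your documents, which bakes the knowledge in but is costly to repeat as the data changes.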

Syllabus

Retraining vs RAG vs Context: Your Local Data on LLMs!


Taught by

Dave's Garage

Related Courses

Pinecone Vercel Starter Template and RAG - Live Code Review Part 2
Pinecone via YouTube
Will LLMs Kill Search? The Future of Information Retrieval
Aleksa Gordić - The AI Epiphany via YouTube
RAG But Better: Rerankers with Cohere AI - Improving Retrieval Pipelines
James Briggs via YouTube
Advanced RAG - Contextual Compressors and Filters - Lecture 4
Sam Witteveen via YouTube
LangChain Multi-Query Retriever for RAG - Advanced Technique for Broader Vector Space Search
James Briggs via YouTube