LangChain + Retrieval Local LLMs for Retrieval QA - No OpenAI
Offered By: Sam Witteveen via YouTube
Course Description
Overview
Explore how to implement LangChain for retrieval QA without relying on OpenAI in this informative video tutorial. Learn to utilize various local language models, including Flan T5-XL, Fastchat-T5, StableVicuna, and WizardLM. Follow along with provided Colab notebooks to gain hands-on experience in setting up and using these models for effective retrieval-based question answering tasks. Discover the potential of open-source alternatives and enhance your understanding of LangChain's versatility in natural language processing applications.
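The exact code lives in the linked Colab notebooks; as a rough orientation only, here is a minimal sketch of the kind of fully local RetrievalQA pipeline the video covers, assuming the classic LangChain API (HuggingFacePipeline, HuggingFaceEmbeddings, Chroma, RetrievalQA). The document path, chunk sizes, and generation settings are illustrative assumptions, not taken from the notebooks.

```python
# Minimal sketch: retrieval QA with no OpenAI calls.
# Local embeddings + Chroma vector store + a local Hugging Face LLM.
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma
from langchain.llms import HuggingFacePipeline
from langchain.chains import RetrievalQA

# 1. Load and chunk the documents to be searched ("my_docs.txt" is a placeholder).
docs = TextLoader("my_docs.txt").load()
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100).split_documents(docs)

# 2. Embed the chunks locally (sentence-transformers) and index them in Chroma.
embeddings = HuggingFaceEmbeddings()  # defaults to all-mpnet-base-v2
db = Chroma.from_documents(chunks, embeddings)

# 3. Wrap a local model (here Flan-T5-XL) as the LangChain LLM.
llm = HuggingFacePipeline.from_model_id(
    model_id="google/flan-t5-xl",
    task="text2text-generation",
    model_kwargs={"temperature": 0, "max_length": 256},
)

# 4. Build the retrieval QA chain and ask a question.
qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=db.as_retriever(search_kwargs={"k": 3}),
)
print(qa.run("What does the document say about X?"))
```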
Syllabus
Intro
Flan T5-XL
Fastchat-T5
StableVicuna
WizardLM
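Each syllabus section swaps a different local model into the same retrieval chain. The sketch below shows one way that swap could be parameterised; the Hugging Face repository IDs and generation settings are assumptions for illustration, not values confirmed by the video.

```python
# Hedged sketch: loading any of the syllabus models through the same wrapper.
# Seq2seq models (Flan-T5, Fastchat-T5) use "text2text-generation";
# decoder-only models (StableVicuna, WizardLM) use "text-generation".
from langchain.llms import HuggingFacePipeline

LOCAL_MODELS = {
    "flan-t5-xl":   ("google/flan-t5-xl",             "text2text-generation"),
    "fastchat-t5":  ("lmsys/fastchat-t5-3b-v1.0",     "text2text-generation"),
    "stablevicuna": ("TheBloke/stable-vicuna-13B-HF", "text-generation"),  # assumed repo ID
    "wizardlm":     ("TheBloke/wizardLM-7B-HF",       "text-generation"),  # assumed repo ID
}

def load_local_llm(name: str) -> HuggingFacePipeline:
    """Return one of the listed models wrapped as a LangChain LLM."""
    model_id, task = LOCAL_MODELS[name]
    return HuggingFacePipeline.from_model_id(
        model_id=model_id,
        task=task,
        model_kwargs={"temperature": 0, "max_length": 512},
    )

# Example: rebuild the QA chain from the earlier sketch with a different model.
# llm = load_local_llm("fastchat-t5")
```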
Taught by
Sam Witteveen
Related Courses
Prompt Templates for GPT-3.5 and Other LLMs - LangChain (James Briggs via YouTube)
Getting Started with GPT-3 vs. Open Source LLMs - LangChain (James Briggs via YouTube)
Chatbot Memory for Chat-GPT, Davinci + Other LLMs - LangChain (James Briggs via YouTube)
Chat in LangChain (James Briggs via YouTube)
LangChain Data Loaders, Tokenizers, Chunking, and Datasets - Data Prep (James Briggs via YouTube)