Customer Support Chatbot Using Custom Knowledge Base with LangChain and Private LLM
Offered By: Venelin Valkov via YouTube
Course Description
Overview
Build a customer support chatbot that leverages custom knowledge from an FAQ help center, using an open-source LLM and free embeddings through LangChain. Learn to create a chatbot capable of delivering real-time, streamed responses on a single T4 GPU. Explore the process of constructing the chatbot, including setting up Google Colab, creating a dataset, integrating an LLM from HuggingFace, embedding data, implementing conversational and QA chains, and developing a chatbot class. Evaluate the chatbot's response quality by testing it with various questions and gain insights into building effective AI-powered customer support solutions.
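As a rough illustration of the pipeline the course walks through, here is a minimal sketch using the classic (pre-0.1) LangChain API. The FAQ file path, the embedding model, and the LLM checkpoint are placeholders chosen for illustration, not necessarily the exact choices made in the video.

```python
# Minimal sketch of an FAQ-grounded support chatbot with LangChain and a HuggingFace LLM.
# Assumptions: classic (pre-0.1) LangChain imports; "faq.txt", the embedding model, and
# the LLM checkpoint below are illustrative placeholders, not the course's exact choices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

from langchain.llms import HuggingFacePipeline
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain

# Load and chunk the FAQ data so each entry fits into the prompt context.
docs = TextLoader("faq.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=512, chunk_overlap=64).split_documents(docs)

# Free, locally run embeddings indexed in a vector store for retrieval.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
db = Chroma.from_documents(chunks, embeddings)

# Open-source LLM loaded in half precision so it fits on a single T4.
model_id = "tiiuae/falcon-7b-instruct"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True
)
llm = HuggingFacePipeline(
    pipeline=pipeline("text-generation", model=model, tokenizer=tokenizer, max_new_tokens=256)
)

# Conversational retrieval chain: memory keeps chat history, the retriever grounds answers in the FAQ.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chatbot = ConversationalRetrievalChain.from_llm(llm=llm, retriever=db.as_retriever(), memory=memory)

print(chatbot({"question": "How do I reset my password?"})["answer"])
```

The course additionally wraps the chain in a small chatbot class and wires up token streaming; the sketch above covers only the retrieval-augmented chain itself.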
Syllabus
- Introduction
- Text Tutorial
- FAQ Data Source
- Google Colab Setup
- Create Dataset
- LLM from HuggingFace in LangChain
- Embedding Data
- Conversational Chain
- QA Chain
- Chatbot Class
- Conclusion
Taught by
Venelin Valkov
Related Courses
- Prompt Templates for GPT-3.5 and Other LLMs - LangChain (James Briggs via YouTube)
- Getting Started with GPT-3 vs. Open Source LLMs - LangChain (James Briggs via YouTube)
- Chatbot Memory for Chat-GPT, Davinci + Other LLMs - LangChain (James Briggs via YouTube)
- Chat in LangChain (James Briggs via YouTube)
- LangChain Data Loaders, Tokenizers, Chunking, and Datasets - Data Prep (James Briggs via YouTube)