Customer Support Chatbot Using Custom Knowledge Base with LangChain and Private LLM
Offered By: Venelin Valkov via YouTube
Course Description
Overview
Build a customer support chatbot that draws on custom knowledge from an FAQ help center, using an open-source LLM and free embeddings orchestrated with LangChain. Learn to deliver real-time, streaming responses on a single GPU (T4). Walk through the full build: setting up Google Colab, creating a dataset, loading an LLM from HuggingFace, embedding the data, implementing conversational and QA chains, and developing a chatbot class. Finally, evaluate the chatbot's response quality by testing it with a variety of questions and gain insights into building effective AI-powered customer support solutions.
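To make the pipeline described above concrete, here is a minimal sketch written against the classic LangChain API. It is illustrative only: the FAQ file, chunk sizes, embedding model, and HuggingFace model id are placeholders, not necessarily the choices made in the video.

```python
# Illustrative sketch of the FAQ chatbot pipeline (placeholder names throughout).
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import HuggingFacePipeline
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# 1. Load the FAQ data and split it into chunks (placeholder file and sizes).
docs = TextLoader("faq.txt").load()
splitter = RecursiveCharacterTextSplitter(chunk_size=512, chunk_overlap=64)
chunks = splitter.split_documents(docs)

# 2. Embed the chunks with a free sentence-transformers model and index them in FAISS.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
db = FAISS.from_documents(chunks, embeddings)

# 3. Load an open-source LLM from HuggingFace and wrap it for LangChain
#    (model id is a placeholder; device_map="auto" requires accelerate).
model_id = "tiiuae/falcon-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
generate = pipeline("text-generation", model=model, tokenizer=tokenizer, max_new_tokens=256)
llm = HuggingFacePipeline(pipeline=generate)

# 4. Wire retrieval, conversation memory, and the LLM into a single chain.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chain = ConversationalRetrievalChain.from_llm(
    llm=llm, retriever=db.as_retriever(), memory=memory
)

print(chain({"question": "How do I reset my password?"})["answer"])
```

The key design point is that the retriever grounds every answer in the embedded FAQ chunks, while the memory object carries chat history so follow-up questions stay in context.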
Syllabus
- Introduction
- Text Tutorial
- FAQ Data Source
- Google Colab Setup
- Create Dataset
- LLM from HuggingFace in LangChain
- Embedding Data
- Conversational Chain
- QA Chain
- Chatbot Class
- Conclusion
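For the final "Chatbot Class" step, a thin wrapper around the chain is usually enough. The sketch below is hypothetical and reuses the chain object from the earlier snippet; the class name and method are assumptions, not the exact code from the video.

```python
# Hypothetical wrapper class around the conversational chain built above.
class SupportChatbot:
    """Answers customer questions via a ConversationalRetrievalChain."""

    def __init__(self, chain):
        # `chain` is the ConversationalRetrievalChain from the previous sketch;
        # its memory keeps the chat history between calls.
        self.chain = chain

    def ask(self, question: str) -> str:
        result = self.chain({"question": question})
        return result["answer"]


bot = SupportChatbot(chain)
print(bot.ask("Which payment methods do you accept?"))
print(bot.ask("Can I change it after placing an order?"))  # follow-up uses stored history
```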
Taught by
Venelin Valkov
Related Courses
- Google BARD and ChatGPT AI for Increased Productivity (Udemy)
- Bringing LLM to the Enterprise - Training From Scratch or Just Fine-Tune With Cerebras-GPT (Prodramp via YouTube)
- Generative AI and Long-Term Memory for LLMs (James Briggs via YouTube)
- Extractive Q&A With Haystack and FastAPI in Python (James Briggs via YouTube)
- OpenAssistant First Models Are Here! - Open-Source ChatGPT (Yannic Kilcher via YouTube)