Generative AI and Long-Term Memory for LLMs
Offered By: James Briggs via YouTube
Course Description
Overview
Explore the concept of generative AI and how long-term memory can be added to Large Language Models (LLMs) in this video. Dive into Generative Question-Answering (GQA) systems and learn how retrieval augmentation can improve LLM performance. Discover how OpenAI's GPT-3, Cohere, and open-source Hugging Face models can be used to build straightforward GQA systems. Examine the OP stack (OpenAI and Pinecone) for retrieval-augmented GQA and walk through practical examples. Gain insights into where generative AI technology is heading and its likely impact beyond current expectations.
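For a concrete picture of the retrieval-augmented GQA flow the video describes (embed the query, retrieve context from a Pinecone index acting as long-term memory, then feed that context to GPT-3), here is a minimal sketch. It assumes the legacy openai (pre-1.0) and pinecone-client (v2) Python SDKs and a pre-populated index; the index name, environment, and metadata field are illustrative placeholders, not values taken from the course.

```python
# Minimal retrieval-augmented GQA sketch with the OP stack (OpenAI + Pinecone).
# Assumes legacy openai (<1.0) and pinecone-client (v2) SDKs and an existing
# index of text chunks with a "text" metadata field (placeholder names).
import openai
import pinecone

openai.api_key = "OPENAI_API_KEY"
pinecone.init(api_key="PINECONE_API_KEY", environment="us-east1-gcp")
index = pinecone.Index("gqa-demo")  # hypothetical pre-populated index

query = "What is retrieval augmentation?"

# 1. Embed the query with the same model used when indexing the documents.
embed = openai.Embedding.create(input=[query], model="text-embedding-ada-002")
query_vector = embed["data"][0]["embedding"]

# 2. Retrieve the most relevant chunks from the vector index (the "long-term memory").
results = index.query(vector=query_vector, top_k=3, include_metadata=True)
contexts = [match["metadata"]["text"] for match in results["matches"]]

# 3. Augment the prompt with the retrieved context and generate an answer.
prompt = (
    "Answer the question using the context below.\n\n"
    "Context:\n" + "\n---\n".join(contexts) +
    f"\n\nQuestion: {query}\nAnswer:"
)
completion = openai.Completion.create(
    model="text-davinci-003", prompt=prompt, max_tokens=200
)
print(completion["choices"][0]["text"].strip())
```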
Syllabus
What is generative AI
Generative question answering
Two options for helping LLMs
Long-term memory in LLMs
OP stack for retrieval augmented GQA
Testing a few examples
Final thoughts on Generative AI
Taught by
James Briggs
Related Courses
Metadata Filtering for Vector Search - Latest Filter Tech (James Briggs via YouTube)
Cohere vs. OpenAI Embeddings - Multilingual Search (James Briggs via YouTube)
Building the Future with LLMs, LangChain, & Pinecone (Pinecone via YouTube)
Supercharging Semantic Search with Pinecone and Cohere (Pinecone via YouTube)
Preventing Déjà Vu - Vector Similarity Search for Security Alerts, with Expel and Pinecone (Pinecone via YouTube)