How to Use LLMs for Fact Checking - Advanced Techniques and Comparisons
Offered By: Trelis Research via YouTube
Course Description
Overview
Explore advanced techniques for using Large Language Models (LLMs) in fact-checking applications in this 41-minute video tutorial from Trelis Research. Learn about citation verification, enforcing structured responses with regex and JSON, and context focusing. Compare long-context and retrieval approaches to fact-checking, using a modified Irish Constitution as the test document. Discover how to set up structured responses with Gemini and OpenAI GPT-4o, and explore fact-checking methods that use BM25 and cosine-similarity retrieval. Gain insights into the effectiveness of the different approaches and access resources for implementing these techniques in your own projects.
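As a taste of the structured-response technique the video covers, here is a minimal sketch of validating an LLM's fact-check output against a fixed JSON shape with a regex on the verdict field. The field names ("claim", "verdict", "citation") and the verdict labels are assumptions for illustration, not the video's exact schema.

```python
import json
import re

# Allowed verdict labels (an illustrative choice, not the video's exact set).
VERDICT_PATTERN = re.compile(r"^(SUPPORTED|REFUTED|NOT_FOUND)$")

def parse_fact_check(raw: str) -> dict:
    """Parse a model's JSON fact-check response and reject malformed output."""
    data = json.loads(raw)  # raises ValueError on non-JSON text
    for field in ("claim", "verdict", "citation"):
        if field not in data:
            raise ValueError(f"missing field: {field}")
    if not VERDICT_PATTERN.match(data["verdict"]):
        raise ValueError(f"invalid verdict: {data['verdict']}")
    return data

# Example of a well-formed model response (hypothetical content):
raw = ('{"claim": "Article 12 sets a seven-year presidential term", '
       '"verdict": "SUPPORTED", "citation": "Article 12.3.1"}')
result = parse_fact_check(raw)
# result["verdict"] == "SUPPORTED"
```

Rejecting anything outside the schema lets a pipeline retry or flag the response instead of silently accepting free-form text.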
Syllabus
Intro - fact checking with LLMs
Video Overview
#1 Citation Verification Technique
#2 Enforcing Structured Responses (regex / JSON) with OpenAI
#3 Context Focusing
Comparing Long Context vs Retrieval approaches to fact-checking
Test Document Preparation - adding errors to the Irish Constitution
Long Context Fact Checking - whole document as context
Long Context Fact Checking with Structured Responses
Setting up structured responses with Gemini and OpenAI GPT-4o
Hacking structured responses with Claude
Fact checking with BM25 and cosine similarity retrieval
Results: Long Context vs Structured vs Retrieval Approaches
Video Resources
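For readers curious about the retrieval side of the syllabus, here is a minimal, self-contained sketch of Okapi BM25 scoring, one of the two retrieval methods the video compares. The constitution-style passages and the claim are invented examples, not the video's modified test document.

```python
import math
from collections import Counter

def bm25_scores(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each document against the query with Okapi BM25."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in tokenized) / len(tokenized)
    n = len(docs)
    # Document frequency: how many docs contain each term.
    df = Counter()
    for toks in tokenized:
        for term in set(toks):
            df[term] += 1
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log((n - df[term] + 0.5) / (df[term] + 0.5) + 1)
            norm = tf[term] + k1 * (1 - b + b * len(toks) / avgdl)
            score += idf * tf[term] * (k1 + 1) / norm
        scores.append(score)
    return scores

# Invented constitution-style passages for illustration:
articles = [
    "The President shall hold office for seven years.",
    "The national flag is the tricolour of green, white and orange.",
    "Dail Eireann shall be composed of members elected by the people.",
]
claim = "the president holds office for seven years"
scores = bm25_scores(claim, articles)
best = max(range(len(articles)), key=scores.__getitem__)
# best == 0: the claim retrieves the presidential-term passage
```

In a fact-checking pipeline, the top-scoring passages (here, from BM25 or from cosine similarity over embeddings) are passed to the LLM as context, trading the cost of a whole-document prompt for the risk of missing the relevant passage.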
Taught by
Trelis Research
Related Courses
Learn Google Bard and Gemini (Udemy)
Gemini and the Future of Generative AI Tools - Interview with Simon Tokumine (TensorFlow via YouTube)
Gemini and GPT Sales Agents with RAG - Comparison and Implementation (echohive via YouTube)
Building a Streamlit Interface for Unified Chat with Multiple LLMs (echohive via YouTube)
Gemini 1.5 Pro for Code - Building LLM Agents with CrewAI (Sam Witteveen via YouTube)