Stop Hallucinations and Half-Truths in Generative Search - Haystack US 2023
Offered By: OpenSource Connections via YouTube
Course Description
Overview
Explore strategies for mitigating hallucinations and inaccuracies in generative search systems in this 43-minute conference talk from Haystack US 2023. Dive into the challenges of integrating Large Language Models (LLMs) into search systems and learn practical approaches for maintaining user trust, including reranking, user warnings, fact-checking systems, and effective LLM usage patterns, prompting, and fine-tuning. Colin Harman, Head of Technology at Nesh, shares his experience combining cutting-edge NLP technology with deep domain understanding to solve complex problems in heavy industries.
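The talk itself does not ship code, but a minimal sketch of the general "retrieve, rerank, constrain the prompt, warn the user" pattern it covers is shown below. Everything in the sketch is a hypothetical stand-in: the Passage type, the call_llm stub, and the term-overlap scorer (a placeholder for a trained cross-encoder reranker) are illustrative assumptions, not APIs or methods from the talk.

```python
# Illustrative sketch of a grounded generative-search pipeline:
# retrieve candidates, rerank them, constrain the LLM to the retrieved
# context, and warn the user when the supporting evidence looks weak.
from dataclasses import dataclass
from typing import List


@dataclass
class Passage:
    source: str
    text: str


def term_overlap(query: str, text: str) -> float:
    """Fraction of query terms found in the text (placeholder relevance score)."""
    q_terms = set(query.lower().split())
    return len(q_terms & set(text.lower().split())) / (len(q_terms) or 1)


def rerank(query: str, passages: List[Passage], top_k: int = 3) -> List[Passage]:
    """Keep the top_k most relevant passages; a real system would use a
    trained cross-encoder or similar reranking model here."""
    return sorted(passages, key=lambda p: term_overlap(query, p.text), reverse=True)[:top_k]


def build_grounded_prompt(query: str, passages: List[Passage]) -> str:
    """Prompt pattern that restricts the LLM to retrieved context and asks it
    to admit when the context is insufficient, rather than invent an answer."""
    context = "\n\n".join(f"[{p.source}] {p.text}" for p in passages)
    return (
        "Answer the question using ONLY the context below, citing the bracketed "
        "source for every claim. If the context does not contain the answer, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )


def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g. a hosted chat-completion API)."""
    return f"<model response to a {len(prompt)}-character prompt>"


def generative_search(query: str, corpus: List[Passage]) -> str:
    top = rerank(query, corpus)
    response = call_llm(build_grounded_prompt(query, top))
    # User warning: if even the best passage barely matches the query, surface
    # the answer with a caveat instead of presenting it as established fact.
    if not top or term_overlap(query, top[0].text) < 0.3:
        response = "[Low confidence: retrieved context may not support this answer]\n" + response
    return response


if __name__ == "__main__":
    docs = [
        Passage("manual-4.2", "The pump must be primed before the valve is opened."),
        Passage("faq", "Warranty claims require the original purchase receipt."),
    ]
    print(generative_search("When should the pump be primed?", docs))
```

The threshold and scoring function here are arbitrary; the point is only the shape of the pipeline, where grounding, reranking, and explicit user warnings each get a distinct, testable step.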
Syllabus
Haystack US 2023 - Colin Harman: Stop Hallucinations and Half-Truths in Generative Search
Taught by
OpenSource Connections
Related Courses
Semantic Web Technologies - openHPI
أساسيات استرجاع المعلومات (Fundamentals of Information Retrieval) - Rwaq (رواق)
《gacco特別企画》Evernoteで広がるgaccoの学びスタイル (ga038) (gacco Special Project: Expanding Your gacco Learning Style with Evernote) - University of Tokyo via gacco
La Web Semántica: Herramientas para la publicación y extracción efectiva de información en la Web (The Semantic Web: Tools for Effective Publication and Extraction of Information on the Web) - Pontificia Universidad Católica de Chile via Coursera
快速学习 (Rapid Learning) - University of Science and Technology of China via Coursera