Stop Hallucinations and Half-Truths in Generative Search - Haystack US 2023
Offered By: OpenSource Connections via YouTube
Course Description
Overview
Explore strategies for mitigating hallucinations and inaccuracies in generative search systems in this 43-minute conference talk from Haystack US 2023. Dive into the challenges of integrating Large Language Models (LLMs) into search systems and learn practical solutions for maintaining user trust. Discover techniques such as reranking, user warnings, and fact-checking systems, along with effective LLM usage patterns, prompting strategies, and fine-tuning. Gain insights from Colin Harman, Head of Technology at Nesh, as he shares his expertise in combining cutting-edge NLP technologies with deep domain understanding to solve complex problems in heavy industries.
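To illustrate the reranking technique the talk covers, here is a minimal sketch, not the implementation demonstrated in the talk: it uses a cross-encoder (assuming the sentence-transformers library and a public ms-marco checkpoint) to reorder retrieved passages so that only the most relevant evidence is placed in the LLM prompt, reducing the chance of ungrounded answers. The rerank helper and the example passages are assumptions added for illustration.

    # Illustrative reranking sketch (not from the talk).
    # Assumes: pip install sentence-transformers
    from sentence_transformers import CrossEncoder

    def rerank(query: str, passages: list[str], top_k: int = 3) -> list[str]:
        """Score each (query, passage) pair with a cross-encoder and
        return the top_k most relevant passages."""
        model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
        scores = model.predict([(query, p) for p in passages])
        ranked = sorted(zip(scores, passages), key=lambda x: x[0], reverse=True)
        return [p for _, p in ranked[:top_k]]

    if __name__ == "__main__":
        query = "What causes hallucinations in generative search?"
        passages = [
            "LLMs can produce fluent but unsupported statements when grounding context is missing.",
            "Video editing software supports cutting, trimming, and transitions.",
            "Restricting the prompt to retrieved, relevant documents reduces fabricated claims.",
        ]
        # Only the top-ranked passages are passed to the LLM as grounding context.
        for p in rerank(query, passages, top_k=2):
            print(p)

In this pattern, the cross-encoder acts as a quality gate between retrieval and generation: irrelevant passages are filtered out before prompting, which addresses one common source of half-truths in generative search.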
Syllabus
Haystack US 2023 - Colin Harman: Stop Hallucinations and Half-Truths in Generative Search
Taught by
OpenSource Connections
Related Courses
TensorFlow: Working with NLP (LinkedIn Learning)
Introduction to Video Editing - Video Editing Tutorials (Great Learning via YouTube)
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning (Python Engineer via YouTube)
GPT3 and Finetuning the Core Objective Functions - A Deep Dive (David Shapiro ~ AI via YouTube)
How to Build a Q&A AI in Python - Open-Domain Question-Answering (James Briggs via YouTube)