Stop Hallucinations and Half-Truths in Generative Search - Haystack US 2023

Offered By: OpenSource Connections via YouTube

Tags

Information Retrieval Courses
Fine-Tuning Courses

Course Description

Overview

Explore strategies to mitigate hallucinations and inaccuracies in generative search systems during this 43-minute conference talk from Haystack US 2023. Dive into the challenges of integrating Large Language Models (LLMs) into search systems and learn proven solutions to maintain user trust. Discover techniques such as reranking, user warnings, fact-checking systems, and effective LLM usage patterns, prompting, and fine-tuning. Gain insights from Colin Harman, Head of Technology at Nesh, as he shares his expertise in combining cutting-edge NLP technologies with deep domain understanding to solve complex problems in heavy industries.
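The reranking technique mentioned above can be sketched in a few lines: retrieved passages are reordered by a relevance score before being placed in the LLM's prompt, so the model grounds its answer in the most on-topic evidence. This is a minimal illustration only; the scoring function below is a hypothetical stand-in (simple term overlap), whereas production systems typically use a trained cross-encoder reranker.

```python
def rerank(query, passages, top_k=2):
    """Order passages by word overlap with the query.

    Hypothetical scorer for illustration; a real reranker would use
    a learned relevance model rather than term overlap.
    """
    query_terms = set(query.lower().split())

    def score(passage):
        # Count query terms that appear in the passage.
        return len(query_terms & set(passage.lower().split()))

    return sorted(passages, key=score, reverse=True)[:top_k]


passages = [
    "The Eiffel Tower is in Paris.",
    "Hallucinations occur when an LLM generates unsupported claims.",
    "Reranking reorders retrieved documents by relevance to the query.",
]
top = rerank("how does reranking reduce LLM hallucinations", passages)
# `top` now holds the two passages most relevant to the query; these
# would be inserted into the prompt as grounding context.
```

Only the highest-scoring passages reach the prompt, which shrinks the chance that the model fills gaps with fabricated content.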

Syllabus

Haystack US 2023 - Colin Harman: Stop Hallucinations and Half-Truths in Generative Search


Taught by

OpenSource Connections

Related Courses

TensorFlow: Working with NLP
LinkedIn Learning
Introduction to Video Editing - Video Editing Tutorials
Great Learning via YouTube
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning
Python Engineer via YouTube
GPT3 and Finetuning the Core Objective Functions - A Deep Dive
David Shapiro ~ AI via YouTube
How to Build a Q&A AI in Python - Open-Domain Question-Answering
James Briggs via YouTube