Practical LLM Fine-Tuning for Semantic Search
Offered By: MLCon | Machine Learning Conference via YouTube
Course Description
Overview
Explore fine-tuning large language models (LLMs) for semantic search in this 49-minute conference talk by Dr. Roman Grebennikov at MLcon Munich 2024. Learn how to customize models for specialized domains such as medicine, law, or hardware using open-source tools like sentence-transformers and nixietune. The talk covers the data requirements for fine-tuning, how to train bi-encoders and cross-encoders, and the quality improvements achievable on a single GPU. It also examines the limitations of off-the-shelf semantic search models and walks through real-world examples of domain-tuned search, offering practical guidance for anyone looking to improve semantic search in a specialized field.
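As background for the bi-encoder approach the talk discusses, here is a minimal, self-contained sketch of the retrieval pattern: documents are embedded once offline, the query is embedded independently, and results are ranked by cosine similarity. The embedding function below is a hypothetical stand-in (hashed bag-of-words), not a trained model like those produced by sentence-transformers; it only illustrates the data flow, and all names and example texts are invented for illustration.

```python
# Toy sketch of bi-encoder retrieval. In practice the embed() function
# would be a fine-tuned transformer (e.g. via sentence-transformers);
# here it is a deterministic hashed bag-of-words stand-in.
import math
import zlib
from collections import Counter


def embed(text: str, dim: int = 64) -> list[float]:
    """Map text to a fixed-size, L2-normalized vector via hashed token counts."""
    vec = [0.0] * dim
    for token, count in Counter(text.lower().split()).items():
        vec[zlib.crc32(token.encode()) % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    """Dot product of two already-normalized vectors."""
    return sum(x * y for x, y in zip(a, b))


docs = [
    "fine tuning embedding models for legal search",
    "how to bake sourdough bread at home",
    "medical question answering with transformers",
]
doc_vecs = [embed(d) for d in docs]  # computed once, offline

query = "fine tuning models for search"
query_vec = embed(query)  # computed per query, independently of the docs
ranked = sorted(range(len(docs)),
                key=lambda i: cosine(query_vec, doc_vecs[i]),
                reverse=True)
print(docs[ranked[0]])
```

Because query and document embeddings are computed independently, document vectors can be precomputed and indexed; a cross-encoder, by contrast, scores each (query, document) pair jointly, which is slower but typically more accurate as a reranking step.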
Syllabus
Practical LLM Fine Tuning For Semantic Search | Dr. Roman Grebennikov
Taught by
MLCon | Machine Learning Conference
Related Courses
TensorFlow: Working with NLP — LinkedIn Learning
Introduction to Video Editing - Video Editing Tutorials — Great Learning via YouTube
HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning — Python Engineer via YouTube
GPT3 and Finetuning the Core Objective Functions - A Deep Dive — David Shapiro ~ AI via YouTube
How to Build a Q&A AI in Python - Open-Domain Question-Answering — James Briggs via YouTube