YoVDO

Fine-Tune High Performance Sentence Transformers With Multiple Negatives Ranking

Offered By: James Briggs via YouTube

Tags

Natural Language Processing (NLP) Courses, PyTorch Courses, Data Preprocessing Courses, Sentence Transformers Courses

Course Description

Overview

Explore the process of fine-tuning high-performance sentence transformers using Multiple Negatives Ranking (MNR) loss in this 37-minute video tutorial. Learn about the evolution of transformer-produced sentence embeddings, from BERT cross-encoders to SBERT and beyond. Discover how MNR loss has advanced the field, enabling newer models trained with it to outperform their predecessors. Dive into the implementation of MNR loss for fine-tuning sentence transformers, covering both a detailed PyTorch approach and a simplified method using the sentence-transformers library. Gain insights into NLI training data, preprocessing techniques, and visual explanations of SBERT fine-tuning and MNR loss. Compare results and understand the impact of this technique on sentence embedding quality.
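The core idea behind MNR loss can be sketched compactly. For a batch of (anchor, positive) sentence pairs, every other positive in the batch serves as an in-batch negative: similarities are computed between all anchors and all positives, and a cross-entropy objective pushes each anchor toward its own positive (the diagonal of the similarity matrix). The NumPy sketch below illustrates this; the function name `mnr_loss` and the `scale` value are illustrative assumptions, not the tutorial's code (the video implements the equivalent in PyTorch and via the sentence-transformers library).

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Illustrative Multiple Negatives Ranking loss on embedding batches.

    anchors, positives: (batch, dim) arrays of sentence embeddings,
    where positives[i] is the matching sentence for anchors[i].
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)

    # Similarity matrix: entry (i, j) = scaled cos-sim(anchor_i, positive_j).
    # Off-diagonal entries act as in-batch negatives.
    sim = scale * (a @ p.T)

    # Cross-entropy with the correct class on the diagonal:
    # positive_i is the right match for anchor_i.
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With matched pairs the loss is near zero; shuffling the positives so no anchor lines up with its pair drives it up, which is exactly the ranking signal used during fine-tuning.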

Syllabus

Intro
NLI Training Data
Preprocessing
SBERT Finetuning Visuals
MNR Loss Visual
MNR in PyTorch
MNR in Sentence Transformers
Results
Outro


Taught by

James Briggs

Related Courses

Semantic Search for AI - Testing Out Qdrant Neural Search
David Shapiro ~ AI via YouTube
How to Use OpenAI Whisper to Fix YouTube Search
James Briggs via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive
James Briggs via YouTube
Train Sentence Transformers by Generating Queries - GenQ
James Briggs via YouTube