Fine-Tune High Performance Sentence Transformers With Multiple Negatives Ranking
Offered By: James Briggs via YouTube
Course Description
Overview
Explore the process of fine-tuning high-performance sentence transformers with Multiple Negatives Ranking (MNR) loss in this 37-minute video tutorial. Learn how transformer-produced sentence embeddings evolved from BERT cross-encoders to SBERT and beyond, and how MNR loss displaced the earlier softmax-classification training objective, letting models fine-tuned with it outperform their predecessors. Dive into the implementation of MNR loss for fine-tuning sentence transformers, covering both a detailed from-scratch approach in PyTorch and a simplified method using the sentence-transformers library, as sketched below. Gain insights into NLI training data, preprocessing, and visual explanations of SBERT fine-tuning and MNR loss. Compare results and see how the technique improves sentence embedding quality.
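To make the "detailed approach" concrete: MNR loss over a batch of (anchor, positive) embedding pairs reduces to cross-entropy over a scaled similarity matrix, where every other positive in the batch acts as an in-batch negative. The sketch below is a minimal illustration, not the video's exact code; the function name is ours, and the scale of 20 mirrors the sentence-transformers default rather than anything mandated by the loss itself.

import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor,
             scale: float = 20.0) -> torch.Tensor:
    # Normalize so the dot product below equals cosine similarity.
    anchors = F.normalize(anchors, dim=-1)
    positives = F.normalize(positives, dim=-1)
    # (batch, batch) matrix: row i scores anchor i against every positive
    # in the batch; off-diagonal entries serve as negatives.
    scores = anchors @ positives.T * scale
    # The true pair for anchor i is positive i, i.e. the diagonal.
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)

# Toy usage with random stand-ins for sentence embeddings.
a = torch.randn(8, 768)
p = torch.randn(8, 768)
print(mnr_loss(a, p))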
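The "simplified method" leans on the library's built-in MultipleNegativesRankingLoss with the classic model.fit training loop (sentence-transformers v2-style API). The base checkpoint, example pairs, and hyperparameters below are placeholder assumptions, not the tutorial's exact setup; in the video the training pairs come from NLI entailment data after preprocessing.

from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Illustrative (anchor, positive) pairs standing in for NLI entailment pairs.
train_examples = [
    InputExample(texts=["A man is eating food.", "A man eats something."]),
    InputExample(texts=["A soccer game is underway.", "People are playing a sport."]),
]

# Any transformer checkpoint can serve as a starting point; this one is an assumption.
model = SentenceTransformer("bert-base-uncased")

# Larger batches give MNR more in-batch negatives and typically better results.
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(train_objectives=[(train_dataloader, train_loss)],
          epochs=1, warmup_steps=10)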
Syllabus
Intro
NLI Training Data
Preprocessing
SBERT Finetuning Visuals
MNR Loss Visual
MNR in PyTorch
MNR in Sentence Transformers
Results
Outro
Taught by
James Briggs
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam