Fine-Tune Sentence Transformers the OG Way - With NLI Softmax Loss
Offered By: James Briggs via YouTube
Course Description
Overview
Explore the fine-tuning of sentence transformers with Natural Language Inference (NLI) softmax loss in this comprehensive video tutorial. Learn about the training approach used in the original sentence-BERT (SBERT) model for producing sentence embeddings. Dive into preprocessing the NLI data, implement the training loop in PyTorch, and then reproduce it with the Sentence-Transformers library. Examine the results and understand why this method, while historically significant, has since been superseded by more effective objectives such as multiple negatives ranking (MNR) loss. Gain insights into applications of the resulting sentence embeddings, including semantic textual similarity, clustering, and information retrieval.
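For orientation, the softmax-loss setup from the original SBERT paper works roughly as follows: a shared encoder produces pooled embeddings u and v for a premise-hypothesis sentence pair, the concatenation (u, v, |u - v|) is passed through a linear classifier over the three NLI labels, and the whole stack is trained with cross-entropy. The sketch below is a minimal illustrative version of that idea, not the video's exact code; the bert-base-uncased checkpoint, mean pooling, and the label mapping are assumptions.

```python
# Minimal sketch of SBERT-style NLI softmax-loss training.
# Assumptions: bert-base-uncased encoder, mean pooling, 3 NLI labels.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
classifier = nn.Linear(768 * 3, 3)  # (u, v, |u - v|) -> entailment/neutral/contradiction
loss_fn = nn.CrossEntropyLoss()

def mean_pool(token_embeddings, attention_mask):
    # Average token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).float()
    return (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

premise = tokenizer(["A man is eating food."], return_tensors="pt")
hypothesis = tokenizer(["A man is eating."], return_tensors="pt")
label = torch.tensor([0])  # e.g. 0 = entailment (label mapping is an assumption)

# Shared encoder produces one pooled embedding per sentence.
u = mean_pool(encoder(**premise).last_hidden_state, premise["attention_mask"])
v = mean_pool(encoder(**hypothesis).last_hidden_state, hypothesis["attention_mask"])

# Concatenate (u, v, |u - v|) and classify over the NLI labels.
logits = classifier(torch.cat([u, v, torch.abs(u - v)], dim=-1))
loss = loss_fn(logits, label)
loss.backward()  # in a real loop, an optimizer step would follow
```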
Syllabus
Intro
NLI Fine-tuning
Softmax Loss Training Overview
Preprocessing NLI Data
PyTorch Process
Using Sentence-Transformers (see the sketch after this syllabus)
Results
Outro
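As a companion to the "Using Sentence-Transformers" section of the syllabus, here is a short sketch of how this objective is typically configured with the library's SoftmaxLoss and classic fit() API. The model name, batch size, and tiny inline dataset are illustrative assumptions, not the video's exact code.

```python
# Hedged sketch: NLI softmax-loss fine-tuning with the sentence-transformers
# library (classic pre-v3 fit() API). Dataset and hyperparameters are
# placeholders for illustration only.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("bert-base-uncased")

# NLI pairs with integer labels: 0 = entailment, 1 = neutral, 2 = contradiction.
train_examples = [
    InputExample(texts=["A man is eating food.", "A man is eating."], label=0),
    InputExample(texts=["A man is eating food.", "A man is sleeping."], label=2),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)

# SoftmaxLoss builds the (u, v, |u - v|) classification head internally,
# so only the embedding dimension and label count need to be supplied.
train_loss = losses.SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=100,
)
```

Note that the classification head is used only during training; at inference time the pooled encoder outputs are taken directly as sentence embeddings.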
Taught by
James Briggs
Related Courses
Semantic Search for AI - Testing Out Qdrant Neural Search (David Shapiro ~ AI via YouTube)
How to Use OpenAI Whisper to Fix YouTube Search (James Briggs via YouTube)
Spotify's Podcast Search Explained (James Briggs via YouTube)
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive (James Briggs via YouTube)
Train Sentence Transformers by Generating Queries - GenQ (James Briggs via YouTube)