Fine-Tune Sentence Transformers the OG Way - With NLI Softmax Loss

Offered By: James Briggs via YouTube

Tags

Deep Learning Courses, Machine Learning Courses, PyTorch Courses, Information Retrieval Courses, Clustering Courses, Data Preprocessing Courses, Sentence Embedding Courses, Sentence Transformers Courses

Course Description

Overview

Explore the fine-tuning of sentence transformers with Natural Language Inference (NLI) softmax loss in this video tutorial. Learn about the training approach used in the first sentence-BERT (SBERT) model for producing sentence embeddings. Dive into the preprocessing of NLI data, walk through the training loop in PyTorch, and then repeat the process with the Sentence-Transformers library. Examine the results and understand why this method, while historically significant, has been superseded by more advanced techniques. Gain insights into applications of these sentence embeddings, such as semantic textual similarity, clustering, and information retrieval.
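The sketch below illustrates the general shape of this style of fine-tune using the Sentence-Transformers SoftmaxLoss. The bert-base-uncased checkpoint, the Hugging Face snli dataset, the 10,000-pair slice, and all hyperparameters are illustrative assumptions, not the exact settings used in the video.

    from datasets import load_dataset
    from torch.utils.data import DataLoader
    from sentence_transformers import SentenceTransformer, InputExample, losses, models

    # Build an SBERT-style model: transformer encoder followed by mean pooling.
    word_emb = models.Transformer("bert-base-uncased", max_seq_length=128)
    pooling = models.Pooling(word_emb.get_word_embedding_dimension(), pooling_mode="mean")
    model = SentenceTransformer(modules=[word_emb, pooling])

    # Load SNLI premise/hypothesis pairs; labels are 0=entailment, 1=neutral,
    # 2=contradiction, with -1 marking unlabelled pairs that must be dropped.
    snli = load_dataset("snli", split="train").filter(lambda x: x["label"] != -1)
    train_examples = [
        InputExample(texts=[row["premise"], row["hypothesis"]], label=row["label"])
        for row in snli.select(range(10_000))  # small slice to keep the demo quick
    ]
    train_loader = DataLoader(train_examples, shuffle=True, batch_size=16)

    # Softmax loss classifies the concatenation (u, v, |u - v|) of the two
    # sentence embeddings into the three NLI labels, as in the SBERT paper.
    train_loss = losses.SoftmaxLoss(
        model=model,
        sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
        num_labels=3,
    )

    model.fit(train_objectives=[(train_loader, train_loss)], epochs=1, warmup_steps=100)

After training, model.encode(...) returns fixed-size sentence vectors that can be compared with cosine similarity for the semantic similarity, clustering, and retrieval use cases mentioned above.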

Syllabus

Intro
NLI Fine-tuning
Softmax Loss Training Overview
Preprocessing NLI Data
PyTorch Process
Using Sentence-Transformers
Results
Outro


Taught by

James Briggs

Related Courses

Genomic Data Science and Clustering (Bioinformatics V)
University of California, San Diego via Coursera
Data Processing Using Python (用Python玩转数据)
Nanjing University via Coursera
Data Mining Project
University of Illinois at Urbana-Champaign via Coursera
Advanced Business Analytics Capstone
University of Colorado Boulder via Coursera
Data Mining: Theories and Algorithms for Tackling Big Data (数据挖掘:理论与算法)
Tsinghua University via edX