YoVDO

Fine-Tune Sentence Transformers the OG Way - With NLI Softmax Loss

Offered By: James Briggs via YouTube

Tags

Deep Learning Courses
Machine Learning Courses
PyTorch Courses
Information Retrieval Courses
Clustering Courses
Data Preprocessing Courses
Sentence Embedding Courses
Sentence Transformers Courses

Course Description

Overview

Explore the fine-tuning process of sentence transformers using Natural Language Inference (NLI) softmax loss in this comprehensive video tutorial. Learn about the training approach used in the first sentence-BERT (SBERT) model for producing sentence embeddings. Walk through preprocessing the NLI data, implement the training process directly in PyTorch, and then reproduce it with the Sentence-Transformers library. Examine the results and understand why this method, while historically significant, has been superseded by more advanced techniques. Gain insights into applications such as semantic textual similarity, clustering, and information retrieval using concept-based embeddings.
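The core of the softmax-loss approach described above is a small classification head: the two sentence embeddings u and v are combined as (u, v, |u − v|) and passed through a linear layer over the three NLI labels. A minimal PyTorch sketch of that idea (the class name, dimensions, and toy data here are illustrative, not the library's actual implementation):

```python
import torch
import torch.nn as nn

class SoftmaxLossHead(nn.Module):
    """Sketch of the SBERT-style softmax classification head: concatenate
    (u, v, |u - v|) and project to the NLI label logits."""

    def __init__(self, embed_dim: int, num_labels: int = 3):
        super().__init__()
        # One weight matrix over the concatenated 3 * embed_dim feature vector.
        self.classifier = nn.Linear(3 * embed_dim, num_labels)

    def forward(self, u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        features = torch.cat([u, v, torch.abs(u - v)], dim=-1)
        return self.classifier(features)

# Toy usage: a batch of 4 premise/hypothesis embedding pairs of dimension 768
# (stand-ins for mean-pooled BERT outputs).
head = SoftmaxLossHead(embed_dim=768)
u = torch.randn(4, 768)  # premise embeddings
v = torch.randn(4, 768)  # hypothesis embeddings
logits = head(u, v)
labels = torch.tensor([0, 1, 2, 0])  # e.g. entailment / neutral / contradiction
loss = nn.functional.cross_entropy(logits, labels)
```

During training, the gradient from this cross-entropy loss flows back through the pooling layer into the transformer itself, which is what actually tunes the sentence embeddings; the head is discarded at inference time.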

Syllabus

Intro
NLI Fine-tuning
Softmax Loss Training Overview
Preprocessing NLI Data
PyTorch Process
Using Sentence-Transformers
Results
Outro
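The "Preprocessing NLI Data" step in the syllabus amounts to pairing premise/hypothesis sentences with integer class ids for the softmax classifier. A minimal sketch, assuming the common SNLI/MultiNLI label convention (the exact mapping and helper names in the video may differ):

```python
# Map NLI string labels to integer ids for the softmax classifier.
# This 0/1/2 convention follows SNLI/MultiNLI; the video's mapping may differ.
LABEL2ID = {"entailment": 0, "neutral": 1, "contradiction": 2}

def preprocess_nli(rows):
    """Turn raw (premise, hypothesis, label) rows into training triples,
    dropping rows without a gold label (SNLI marks these with '-')."""
    out = []
    for premise, hypothesis, label in rows:
        if label not in LABEL2ID:
            continue  # skip un-annotated pairs
        out.append((premise, hypothesis, LABEL2ID[label]))
    return out

# Illustrative rows, not taken from the actual dataset.
pairs = preprocess_nli([
    ("A man plays guitar.", "A person makes music.", "entailment"),
    ("A man plays guitar.", "A man plays in a band.", "neutral"),
    ("A man plays guitar.", "Nobody is playing music.", "contradiction"),
    ("A man plays guitar.", "An un-annotated pair.", "-"),
])
```

Each resulting triple can then be wrapped in the Sentence-Transformers `InputExample` class (or a plain PyTorch dataset) for the training loop covered in the video.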


Taught by

James Briggs

Related Courses

Semantic Web Technologies
openHPI
Fundamentals of Information Retrieval (أساسيات استرجاع المعلومات)
Rwaq (رواق)
(gacco Special Feature) Expanding Your gacco Learning Style with Evernote (ga038)
University of Tokyo via gacco
The Semantic Web: Tools for Effective Publication and Extraction of Information on the Web
Pontificia Universidad Católica de Chile via Coursera
Rapid Learning (快速学习)
University of Science and Technology of China via Coursera