AugSBERT: Domain Transfer for Sentence Transformers

Offered By: James Briggs via YouTube

Tags

BERT Courses, Language Models Courses, Sentence Transformers Courses

Course Description

Overview

Learn how to effectively transfer information from out-of-domain datasets to your target domain using the AugSBERT training strategy in this comprehensive video tutorial. Discover techniques for quickly assessing how well a source dataset aligns with your target domain, and explore the step-by-step process of implementing the AugSBERT domain-transfer training strategy. Gain insights into training source cross-encoders, labeling target data, and evaluating bi-encoder performance to optimize your language models for semantic search applications. Perfect for NLP practitioners and researchers looking to enhance their models' performance through domain transfer techniques.
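One quick way to sanity-check source/target alignment before committing to training is a purely lexical heuristic such as vocabulary overlap. This is an illustrative sketch only, not necessarily the check used in the video; the `vocab_overlap` helper below is a hypothetical name:

```python
import re

def vocab_overlap(source_texts, target_texts):
    """Crude domain-alignment proxy: Jaccard overlap of the two
    datasets' vocabularies. Higher overlap suggests the source data
    is lexically closer to the target domain. A quick heuristic only,
    not a substitute for evaluating a trained model."""
    def vocab(texts):
        return {w for t in texts for w in re.findall(r"[a-z']+", t.lower())}
    src, tgt = vocab(source_texts), vocab(target_texts)
    union = src | tgt
    return len(src & tgt) / len(union) if union else 0.0
```

For example, `vocab_overlap(["the cat sat"], ["the cat ran"])` shares 2 of 4 distinct tokens and returns 0.5; a near-zero score is a signal that domain transfer from that source may struggle.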

Syllabus

Why Use Domain Transfer
Strategy Outline
Train Source Cross-Encoder
Cross-Encoder Outcome
Labeling Target Data
Training Bi-encoder
Evaluate Bi-encoder Performance
Final Points
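The syllabus steps above can be sketched end-to-end with the sentence-transformers library. This is a minimal illustration under assumed checkpoints (`bert-base-uncased`) and toy data, not the video's exact code; the pure-Python helpers are illustrative names, and the heavy training calls are kept inside `main()`:

```python
import random
from itertools import combinations

def sample_target_pairs(sentences, n_pairs, seed=0):
    """'Labeling Target Data', part 1: sample unlabeled sentence pairs
    from the target domain for the source cross-encoder to score."""
    rng = random.Random(seed)
    pool = list(combinations(sentences, 2))
    return rng.sample(pool, min(n_pairs, len(pool)))

def to_silver_dataset(pairs, scores):
    """'Labeling Target Data', part 2: attach cross-encoder scores to the
    sampled pairs, producing a pseudo-labeled 'silver' training set."""
    return [(a, b, float(s)) for (a, b), s in zip(pairs, scores)]

def main():
    # Heavy imports kept local; requires `pip install sentence-transformers`.
    from torch.utils.data import DataLoader
    from sentence_transformers import (CrossEncoder, SentenceTransformer,
                                       InputExample, losses)

    # 1) Train Source Cross-Encoder on labeled out-of-domain pairs
    #    (toy examples here; use a real STS-style source dataset).
    source = [InputExample(texts=["a man is eating", "someone eats"], label=0.9),
              InputExample(texts=["a man is eating", "a plane lands"], label=0.1)]
    ce = CrossEncoder("bert-base-uncased", num_labels=1)
    ce.fit(train_dataloader=DataLoader(source, shuffle=True, batch_size=16),
           epochs=1, warmup_steps=10)

    # 2) Label Target Data: score unlabeled in-domain pairs.
    target = ["how do i reset my password", "steps to reset a password",
              "cancel my subscription", "update billing details"]
    pairs = sample_target_pairs(target, n_pairs=4)
    silver = to_silver_dataset(pairs, ce.predict([list(p) for p in pairs]))

    # 3) Training Bi-encoder on the silver dataset with a cosine loss.
    bi = SentenceTransformer("bert-base-uncased")
    train = [InputExample(texts=[a, b], label=s) for a, b, s in silver]
    bi.fit(train_objectives=[(DataLoader(train, shuffle=True, batch_size=16),
                              losses.CosineSimilarityLoss(bi))],
           epochs=1, warmup_steps=10)
```

Calling `main()` runs the full pipeline (it downloads `bert-base-uncased` and trains for one epoch, so expect a few minutes on CPU); evaluating the resulting bi-encoder on held-out labeled target pairs is the final step covered in the video.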


Taught by

James Briggs

Related Courses

Semantic Search for AI - Testing Out Qdrant Neural Search
David Shapiro ~ AI via YouTube
How to Use OpenAI Whisper to Fix YouTube Search
James Briggs via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive
James Briggs via YouTube
Train Sentence Transformers by Generating Queries - GenQ
James Briggs via YouTube