YoVDO

AugSBERT: Domain Transfer for Sentence Transformers

Offered By: James Briggs via YouTube

Tags

BERT Courses Language Models Courses Sentence Transformers Courses

Course Description

Overview

Learn how to transfer information from out-of-domain datasets to your target domain using the AugSBERT training strategy in this video tutorial. Discover techniques for quickly assessing how well a source dataset aligns with your target domain, and follow the step-by-step process of implementing the AugSBERT domain-transfer strategy. Gain insight into training a source cross-encoder, labeling target data, and evaluating bi-encoder performance to optimize language models for semantic search. Suited to NLP practitioners and researchers looking to improve their models' performance through domain transfer.

Syllabus

Why Use Domain Transfer
Strategy Outline
Train Source Cross-Encoder
Cross-Encoder Outcome
Labeling Target Data
Training Bi-encoder
Evaluate Bi-encoder Performance
Final Points
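The syllabus above follows the standard AugSBERT domain-transfer recipe: fine-tune a cross-encoder on labeled source-domain pairs, use it to assign "silver" labels to unlabeled target-domain pairs, then train a bi-encoder on those silver labels. A minimal sketch of the labeling step is shown below; `score_pairs` is a stand-in for the trained cross-encoder's `predict` call from the sentence-transformers library (a trivial lexical-overlap scorer is substituted here so the sketch runs without downloading a model, and the example sentences are illustrative):

```python
# Sketch of the AugSBERT labeling step: a cross-encoder trained on the
# source domain scores unlabeled target-domain sentence pairs, producing
# silver labels for bi-encoder training.

def score_pairs(pairs):
    """Stand-in for cross_encoder.predict(pairs); returns scores in [0, 1].

    A real AugSBERT pipeline would call a fine-tuned CrossEncoder here;
    this placeholder uses Jaccard word overlap purely for illustration.
    """
    scores = []
    for a, b in pairs:
        ta, tb = set(a.lower().split()), set(b.lower().split())
        scores.append(len(ta & tb) / len(ta | tb))
    return scores

def label_target_pairs(pairs):
    """Attach silver labels to unlabeled target-domain pairs."""
    return [(a, b, s) for (a, b), s in zip(pairs, score_pairs(pairs))]

# Hypothetical unlabeled pairs from the target domain.
target_pairs = [
    ("how do I reset my password", "steps to reset a password"),
    ("how do I reset my password", "best hiking trails near Denver"),
]

silver = label_target_pairs(target_pairs)
# Each entry is (sentence_a, sentence_b, score); in sentence-transformers
# these would be wrapped as InputExample objects to train the bi-encoder.
```

The design point is that the expensive, accurate cross-encoder is only used offline to create training data, while the resulting bi-encoder produces independent sentence embeddings cheap enough for semantic search at scale.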


Taught by

James Briggs

Related Courses

Microsoft Bot Framework and Conversation as a Platform
Microsoft via edX
Unlocking the Power of OpenAI for Startups - Microsoft for Startups
Microsoft via YouTube
Improving Customer Experiences with Speech to Text and Text to Speech
Microsoft via YouTube
Stanford Seminar - Deep Learning in Speech Recognition
Stanford University via YouTube
Select Topics in Python: Natural Language Processing
Codio via Coursera