YoVDO

AugSBERT: Domain Transfer for Sentence Transformers

Offered By: James Briggs via YouTube

Tags

BERT Courses
Language Models Courses
Sentence Transformers Courses

Course Description

Overview

Learn how to effectively transfer information from out-of-domain datasets to your target domain using the AugSBERT training strategy in this comprehensive video tutorial. Discover techniques for quickly assessing source dataset alignment with your target domain, and explore the step-by-step process of implementing the AugSBERT domain-transfer training strategy. Gain insights into training source cross-encoders, labeling target data, and evaluating bi-encoder performance to optimize your language models for semantic search applications. Perfect for NLP practitioners and researchers looking to enhance their model's performance through domain transfer techniques.

Syllabus

Why Use Domain Transfer
Strategy Outline
Train Source Cross-Encoder
Cross-Encoder Outcome
Labeling Target Data
Training Bi-encoder
Evaluating Bi-encoder Performance
Final Points
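
The syllabus steps above follow a simple data flow: a cross-encoder trained on labeled source-domain pairs is used to "silver-label" unlabeled target-domain pairs, and that silver data then trains the bi-encoder. A minimal sketch of the labeling step is below; the `overlap_score` function is a hypothetical stand-in for a real trained cross-encoder (e.g. one whose `predict` method returns a similarity score per pair), used here only to keep the example self-contained.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class SilverExample:
    texts: Tuple[str, str]   # a sentence pair from the target domain
    label: float             # similarity score assigned by the scorer

def label_target_pairs(
    score_fn: Callable[[str, str], float],
    pairs: List[Tuple[str, str]],
) -> List[SilverExample]:
    """The 'Labeling Target Data' step: score each unlabeled target pair,
    producing silver training data for the bi-encoder."""
    return [SilverExample(texts=p, label=score_fn(*p)) for p in pairs]

# Hypothetical stand-in scorer: Jaccard word overlap in [0, 1].
# A real AugSBERT pipeline would call the trained source cross-encoder here.
def overlap_score(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

target_pairs = [
    ("how do i reset my password", "steps to reset a password"),
    ("how do i reset my password", "today's weather forecast"),
]
silver = label_target_pairs(overlap_score, target_pairs)
# `silver` now plays the role of labeled training data for the bi-encoder.
```

In the full strategy, swapping `overlap_score` for the source-trained cross-encoder is what transfers knowledge across domains: the target bi-encoder never needs gold labels of its own.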


Taught by

James Briggs

Related Courses

Building Language Models on AWS (Japanese)
Amazon Web Services via AWS Skill Builder
Building Language Models on AWS (Korean)
Amazon Web Services via AWS Skill Builder
Building Language Models on AWS (Simplified Chinese)
Amazon Web Services via AWS Skill Builder
Building Language Models on AWS (Traditional Chinese)
Amazon Web Services via AWS Skill Builder
Introduction to ChatGPT
edX