AugSBERT: Domain Transfer for Sentence Transformers

Offered By: James Briggs via YouTube

Tags

BERT, Language Models, Sentence Transformers

Course Description

Overview

Learn how to effectively transfer information from out-of-domain datasets to your target domain using the AugSBERT training strategy in this comprehensive video tutorial. Discover techniques for quickly assessing source dataset alignment with your target domain, and explore the step-by-step process of implementing the AugSBERT domain-transfer training strategy. Gain insights into training source cross-encoders, labeling target data, and evaluating bi-encoder performance to optimize your language models for semantic search applications. Perfect for NLP practitioners and researchers looking to enhance their model's performance through domain transfer techniques.

Syllabus

Why Use Domain Transfer
Strategy Outline
Train Source Cross-Encoder
Cross-Encoder Outcome
Labeling Target Data
Training Bi-encoder
Evaluate Bi-encoder Performance
Final Points
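
The core of the strategy outlined above is the "Labeling Target Data" step: a cross-encoder trained on the source domain scores unlabeled target-domain sentence pairs, producing a "silver" dataset that supervises the bi-encoder. A minimal sketch of that step follows; the word-overlap `stub_score` function is a hypothetical stand-in for a real trained cross-encoder (in practice you would call something like `CrossEncoder.predict` from the sentence-transformers library), used here only to keep the example self-contained.

```python
# Sketch of the AugSBERT silver-dataset labeling step.
# A stub scorer stands in for a cross-encoder trained on the
# source domain; swap in sentence_transformers.CrossEncoder
# for real use.

def label_target_pairs(pairs, score_fn):
    """Score each unlabeled target-domain pair with the
    (source-trained) cross-encoder, yielding 'silver' labels
    that then supervise bi-encoder training."""
    return [(a, b, score_fn(a, b)) for a, b in pairs]

def stub_score(a, b):
    # Hypothetical stand-in for cross_encoder.predict([(a, b)]);
    # Jaccard word overlap, just to make the sketch runnable.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

target_pairs = [
    ("how do I reset my password", "password reset steps"),
    ("how do I reset my password", "best pizza in town"),
]

silver = label_target_pairs(target_pairs, stub_score)
for a, b, score in silver:
    print(f"{score:.2f}  {a!r} <-> {b!r}")
```

The resulting `(sentence_a, sentence_b, score)` triples can be fed to the bi-encoder as regression training examples, which is how the domain transfer actually happens.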


Taught by

James Briggs

Related Courses

Sentiment Analysis with Deep Learning using BERT
Coursera Project Network via Coursera
Natural Language Processing with Attention Models
DeepLearning.AI via Coursera
Fine Tune BERT for Text Classification with TensorFlow
Coursera Project Network via Coursera
Deploy a BERT question answering bot on Django
Coursera Project Network via Coursera
Generating discrete sequences: language and music
Ural Federal University via edX