Today Unsupervised Sentence Transformers, Tomorrow Skynet - How TSDAE Works
Offered By: James Briggs via YouTube
Course Description
Overview
Explore unsupervised sentence transformers in this comprehensive 44-minute video tutorial. Dive into the challenge of adapting pretrained transformers to produce meaningful sentence vectors, especially in domains and languages with little labeled data. Learn about the Transformer-based Sequential Denoising Auto-Encoder (TSDAE) as an alternative to supervised fine-tuning methods. Walk through data preparation, model initialization, training, and evaluation for TSDAE. Compare TSDAE's effectiveness against supervised methods and understand its advantages when labeled data is scarce. Gain insight into why language embeddings matter, supervised techniques such as Natural Language Inference and Semantic Textual Similarity, and the potential of multilingual training.
Syllabus
Why Language Embedding Matters
Supervised Methods
Natural Language Inference
Semantic Textual Similarity
Multilingual Training
TSDAE Unsupervised
Data Preparation
Initialize Model
Model Training
NLTK Error
Evaluation
TSDAE vs Supervised Methods
Why TSDAE is Cool
Taught by
James Briggs
Related Courses
Semantic Search for AI - Testing Out Qdrant Neural Search (David Shapiro ~ AI via YouTube)
How to Use OpenAI Whisper to Fix YouTube Search (James Briggs via YouTube)
Spotify's Podcast Search Explained (James Briggs via YouTube)
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive (James Briggs via YouTube)
Train Sentence Transformers by Generating Queries - GenQ (James Briggs via YouTube)