Today Unsupervised Sentence Transformers, Tomorrow Skynet - How TSDAE Works
Offered By: James Briggs via YouTube
Course Description
Overview
Explore unsupervised sentence transformers in this comprehensive 44-minute video tutorial. Dive into the challenges of adapting pretrained transformers to produce meaningful sentence vectors, especially in domains and languages with limited labeled data. Learn about the Transformer-based Sequential Denoising Auto-Encoder (TSDAE) approach as an alternative to supervised fine-tuning methods. Walk through data preparation, model initialization, training, and evaluation for TSDAE. Compare the effectiveness of TSDAE with supervised methods and understand its advantages when labeled data is scarce. Gain insights into why language embeddings matter, supervised techniques such as Natural Language Inference and Semantic Textual Similarity, and the potential of multilingual training.
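As a rough illustration of the workflow covered in the video (data preparation, model initialization, training), the sketch below sets up TSDAE fine-tuning with the sentence-transformers library. The model name 'bert-base-uncased', the placeholder sentence list, and the hyperparameters are assumptions for illustration, not necessarily the exact values used in the tutorial.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, datasets, losses
import nltk

# DenoisingAutoEncoderDataset splits text with NLTK; downloading the 'punkt'
# tokenizer data avoids the missing-resource error mentioned in the syllabus.
nltk.download('punkt')

# Build a sentence transformer from a plain pretrained transformer
# with CLS pooling, as recommended for TSDAE.
word_embedding_model = models.Transformer('bert-base-uncased')
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(), 'cls'
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# Unlabeled, in-domain sentences; the dataset adds noise (token deletion)
# on the fly and keeps the clean sentence as the reconstruction target.
train_sentences = ['first raw sentence ...', 'second raw sentence ...']
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
loader = DataLoader(train_dataset, batch_size=8, shuffle=True, drop_last=True)

# The denoising auto-encoder loss ties encoder and decoder weights and
# trains the model to reconstruct the original sentence from its noisy version.
loss = losses.DenoisingAutoEncoderLoss(
    model, decoder_name_or_path='bert-base-uncased', tie_encoder_decoder=True
)

model.fit(
    train_objectives=[(loader, loss)],
    epochs=1,
    weight_decay=0,
    scheduler='constantlr',
    optimizer_params={'lr': 3e-5},
    show_progress_bar=True,
)
```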
Syllabus
Why Language Embedding Matters
Supervised Methods
Natural Language Inference
Semantic Textual Similarity
Multilingual Training
TSDAE Unsupervised
Data Preparation
Initialize Model
Model Training
NLTK Error
Evaluation
TSDAE vs Supervised Methods
Why TSDAE is Cool
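For the evaluation step listed in the syllabus, one common approach is to score the fine-tuned model on labeled similarity pairs with sentence-transformers' EmbeddingSimilarityEvaluator. The sketch below assumes a placeholder model path and two made-up sentence pairs; in practice the pairs would come from an STS-style development set.

```python
from sentence_transformers import SentenceTransformer, InputExample
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

# Load the TSDAE-trained model; 'output/tsdae-bert' is a placeholder path.
model = SentenceTransformer('output/tsdae-bert')

# Hypothetical labeled pairs with similarity scores scaled to [0, 1].
dev_examples = [
    InputExample(texts=['A man is playing a guitar.',
                        'Someone plays an instrument.'], label=0.8),
    InputExample(texts=['A man is playing a guitar.',
                        'A chef is cooking pasta.'], label=0.1),
]

evaluator = EmbeddingSimilarityEvaluator.from_input_examples(
    dev_examples, name='sts-dev'
)

# The evaluator computes cosine similarity between each pair's embeddings
# and reports its correlation with the gold scores, the usual yardstick
# for comparing TSDAE against supervised fine-tuning.
result = evaluator(model)
print(result)
```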
Taught by
James Briggs
Related Courses
Building a unique NLP project: 1984 book vs 1984 album (Coursera Project Network via Coursera)
Exam Prep AI-102: Microsoft Azure AI Engineer Associate (Whizlabs via Coursera)
Amazon Echo Reviews Sentiment Analysis Using NLP (Coursera Project Network via Coursera)
Amazon Translate: Translate documents with batch translation (Coursera Project Network via Coursera)
Analyze Text Data with Yellowbrick (Coursera Project Network via Coursera)