Intro to Sentence Embeddings with Transformers
Offered By: James Briggs via YouTube
Course Description
Overview
Learn about sentence embeddings using transformers in this informative video tutorial. Explore the evolution of natural language processing from recurrent neural networks to transformer models like BERT and GPT. Discover how sentence transformers have revolutionized semantic similarity applications. Gain insights into machine translation, cross-encoders, the softmax loss approach, and label features. Follow along with a Python implementation to understand practical applications. Delve into the transformative impact of these models on tasks such as question answering, article writing, and semantic search.
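The core idea the course builds toward, turning a sequence of token vectors into one fixed-size sentence embedding and comparing sentences by cosine similarity, can be sketched in a few lines. This is a minimal illustration, not the course's code: the random arrays stand in for token embeddings that would normally come from a transformer encoder, and the `mean_pool` helper is a hypothetical name.

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average the token vectors, ignoring padding positions."""
    mask = attention_mask[:, None].astype(float)          # (seq_len, 1)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Similarity in [-1, 1] between two sentence embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Stand-in token embeddings; in practice these are transformer outputs.
rng = np.random.default_rng(0)
tokens_a = rng.normal(size=(5, 8))        # sentence A: 5 tokens, 8-dim
tokens_b = rng.normal(size=(7, 8))        # sentence B: 7 tokens, 8-dim
mask_a = np.ones(5, dtype=int)
mask_b = np.ones(7, dtype=int)

u = mean_pool(tokens_a, mask_a)           # one fixed-size vector per sentence
v = mean_pool(tokens_b, mask_b)
print(cosine_similarity(u, v))
```

With a real model the pooling runs over contextualized token states, but the comparison step for semantic search is exactly this cosine similarity.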
Syllabus
Introduction
Machine Translation
Transformer Models
Cross-Encoders
Softmax Loss Approach
Label Features
Python Implementation
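The "Softmax Loss Approach" and "Label Features" items refer to the sentence-transformer training setup in which two sentence embeddings u and v are combined into the feature vector (u, v, |u - v|) and classified into NLI labels. A rough sketch of that classification head, with untrained random weights and hypothetical helper names, assuming 3 labels (entailment, neutral, contradiction):

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over a logit vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def nli_logits(u: np.ndarray, v: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Build the (u, v, |u - v|) label-feature vector and project to logits."""
    features = np.concatenate([u, v, np.abs(u - v)])   # shape (3 * dim,)
    return W @ features

dim, n_labels = 8, 3
rng = np.random.default_rng(1)
u, v = rng.normal(size=dim), rng.normal(size=dim)      # two sentence embeddings
W = rng.normal(size=(n_labels, 3 * dim))               # untrained head, illustration only

probs = softmax(nli_logits(u, v, W))
print(probs)                                           # one probability per NLI label
```

During training, cross-entropy on these probabilities is what pushes the encoder to produce embeddings where semantic similarity is linearly recoverable.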
Taught by
James Briggs
Related Courses
2024 Advanced Machine Learning and Deep Learning Projects (Udemy)
Fine-Tune Sentence Transformers the OG Way - With NLI Softmax Loss (James Briggs via YouTube)
CMU Advanced NLP: Bias and Fairness (Graham Neubig via YouTube)
CMU Advanced NLP: Pre-training Methods (Graham Neubig via YouTube)
CMU Neural Nets for NLP: Model Interpretation (Graham Neubig via YouTube)