Intro to Sentence Embeddings with Transformers
Offered By: James Briggs via YouTube
Course Description
Overview
Learn about sentence embeddings using transformers in this informative video tutorial. Explore the evolution of natural language processing from recurrent neural networks to transformer models like BERT and GPT. Discover how sentence transformers have revolutionized semantic similarity applications. Gain insights into machine translation, cross-encoders, the softmax loss approach, and label features. Follow along with a Python implementation to understand practical applications. Delve into the transformative impact of these models on tasks such as question answering, article writing, and semantic search.
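To give a sense of what the tutorial covers, here is a minimal sketch (not the video's exact code) of computing sentence embeddings with the sentence-transformers library and comparing them with cosine similarity; the model name and example sentences are illustrative choices, not taken from the course.

```python
# Minimal sketch: encode sentences with a pretrained sentence-transformer
# and compare them with cosine similarity. Model and sentences are assumed,
# not taken from the video.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # any pretrained sentence-transformer works

sentences = [
    "Sentence transformers map text to dense vectors.",
    "These embeddings capture semantic similarity.",
    "The weather in London is often rainy.",
]

# Encode each sentence into a fixed-size embedding (one vector per sentence)
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity matrix: higher scores indicate closer meanings
scores = util.cos_sim(embeddings, embeddings)
print(scores)
```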
Syllabus
Introduction
Machine Translation
Transformer Models
Cross-Encoders
Softmax Loss Approach
Label Features (see the sketch after this syllabus)
Python Implementation
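The softmax loss approach and label features listed above follow the idea of training sentence embeddings on natural language inference data. The sketch below is an assumed illustration of that idea, not the video's code: two sentence embeddings u and v are concatenated with their element-wise difference and classified into NLI labels; the embedding size, batch size, and values are placeholders.

```python
# Rough sketch (assumed, not taken from the video) of the softmax-loss idea:
# sentence embeddings u and v plus |u - v| are concatenated ("label features")
# and classified into entailment / neutral / contradiction.
import torch
import torch.nn as nn

dim = 768          # embedding size of the underlying transformer (assumed)
num_labels = 3     # NLI classes: entailment, neutral, contradiction

classifier = nn.Linear(3 * dim, num_labels)
loss_fn = nn.CrossEntropyLoss()  # softmax (cross-entropy) loss

u = torch.randn(8, dim)  # batch of premise embeddings (placeholder values)
v = torch.randn(8, dim)  # batch of hypothesis embeddings (placeholder values)
labels = torch.randint(0, num_labels, (8,))

features = torch.cat([u, v, torch.abs(u - v)], dim=-1)  # concatenated label features
loss = loss_fn(classifier(features), labels)
loss.backward()
```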
Taught by
James Briggs
Related Courses
Natural Language Processing - Columbia University via Coursera
Natural Language Processing - Stanford University via Coursera
Introduction to Natural Language Processing - University of Michigan via Coursera
moocTLH: Nuevos retos en las tecnologías del lenguaje humano (New Challenges in Human Language Technologies) - Universidad de Alicante via Miríadax
Natural Language Processing - Indian Institute of Technology, Kharagpur via Swayam