
Embedding Models: From Architecture to Implementation

Offered By: DeepLearning.AI via Coursera

Tags

  • BERT Courses
  • Word Embeddings Courses
  • Semantic Search Courses
  • Contrastive Learning Courses
  • Sentence Embedding Courses
  • Retrieval Augmented Generation (RAG) Courses

Course Description

Overview

Join our new short course, Embedding Models: From Architecture to Implementation! Learn from Ofer Mendelevitch, Head of Developer Relations at Vectara. This course goes into the details of the architecture and capabilities of embedding models, which are used in many AI applications to capture the meaning of words and sentences. You will learn about the evolution of embedding models, from word to sentence embeddings, and build and train a simple dual encoder model. This hands-on approach will help you understand the technical concepts behind embedding models and how to use them effectively.

In detail, you'll:

1. Learn about word embedding, sentence embedding, and cross-encoder models, and how they can be used in RAG (the third sketch below contrasts a cross-encoder with embedding-based retrieval).
2. Understand how transformer models, specifically BERT (Bidirectional Encoder Representations from Transformers), are trained and used in semantic search systems (see the first sketch below).
3. Trace the evolution of sentence embeddings and understand how the dual encoder architecture came about.
4. Use a contrastive loss to train a dual encoder model, with one encoder trained for questions and another for the responses (see the second sketch below).
5. Use separate encoders for questions and answers in a RAG pipeline and see how this affects retrieval compared to using a single encoder model.

By the end of this course, you will understand word, sentence, and cross-encoder embedding models, and how transformer-based models like BERT are trained and used in semantic search. You will also learn how to train dual encoder models with contrastive loss and evaluate their impact on retrieval in a RAG pipeline.
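To make item 2 concrete, here is a minimal sketch (not course material) of using BERT to embed sentences and rank documents by cosine similarity. It assumes the Hugging Face transformers and torch packages; the bert-base-uncased checkpoint and mean pooling are illustrative choices, not necessarily what the course uses.

```python
# Minimal sketch: BERT sentence embeddings for semantic search.
# Checkpoint and pooling strategy are assumptions for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Mean-pool the last hidden states into one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)                  # ignore padding tokens
    return summed / mask.sum(dim=1)                      # (B, H)

docs = ["BERT is a bidirectional transformer encoder.",
        "Contrastive loss pulls matching pairs together."]
q = embed(["What architecture does BERT use?"])
d = embed(docs)
scores = torch.nn.functional.cosine_similarity(q, d)    # one score per doc
print(docs[scores.argmax()])
```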
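Items 4 and 5 center on the dual encoder. The following is a hedged sketch of one training step with an in-batch contrastive loss: each (question, answer) pair is a positive, and every other answer in the batch serves as a negative. Model names, pooling, temperature, and learning rate are assumptions for illustration, not the course's actual code.

```python
# Illustrative dual-encoder training step with in-batch contrastive loss.
# Two independent encoders, one for questions and one for answers; the
# diagonal of the similarity matrix holds the positive pairs.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
q_encoder = AutoModel.from_pretrained("bert-base-uncased")   # questions
a_encoder = AutoModel.from_pretrained("bert-base-uncased")   # answers

def encode(encoder, texts):
    batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
    # Use the [CLS] token's hidden state as the sentence embedding.
    return encoder(**batch).last_hidden_state[:, 0]

def contrastive_step(questions, answers, temperature=0.05):
    q = F.normalize(encode(q_encoder, questions), dim=-1)
    a = F.normalize(encode(a_encoder, answers), dim=-1)
    # sim[i, j] = similarity of question i with answer j; off-diagonal
    # entries act as in-batch negatives.
    sim = q @ a.T / temperature
    labels = torch.arange(len(questions))
    return F.cross_entropy(sim, labels)

params = list(q_encoder.parameters()) + list(a_encoder.parameters())
opt = torch.optim.AdamW(params, lr=2e-5)
loss = contrastive_step(
    ["What is BERT?", "What is RAG?"],
    ["A bidirectional transformer encoder.",
     "Retrieval augmented generation."])
loss.backward()
opt.step()
```

At retrieval time in a RAG pipeline, the answer encoder would embed the corpus offline and the question encoder would embed incoming queries; that asymmetry is what item 5 asks you to compare against a single shared encoder.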
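For the cross-encoder mentioned in item 1, here is a sketch of the contrasting design: instead of embedding query and document separately, a cross-encoder reads them together and emits a relevance score, which is typically more accurate but too slow for first-stage retrieval over a large corpus. The checkpoint below is an assumed public reranker, not one named by the course.

```python
# Illustrative cross-encoder scoring: query and document pass through one
# transformer together, yielding a single relevance logit per pair.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "cross-encoder/ms-marco-MiniLM-L-6-v2"   # assumed checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

query = "What architecture does BERT use?"
docs = ["BERT is a bidirectional transformer encoder.",
        "Contrastive loss pulls matching pairs together."]

batch = tok([query] * len(docs), docs, padding=True,
            truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**batch).logits.squeeze(-1)   # one score per (query, doc)
print(docs[scores.argmax()])
```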

Syllabus

  • Embedding Models: From Architecture to Implementation

Taught by

Ofer Mendelevitch

Related Courses

U&P AI - Natural Language Processing (NLP) with Python
Udemy
What's New in Cognitive Search and Cool Frameworks with PyTorch - Episode 5
Microsoft via YouTube
Stress Testing Qdrant - Semantic Search with 90,000 Vectors - Lightning Fast Search Microservice
David Shapiro ~ AI via YouTube
Semantic Search for AI - Testing Out Qdrant Neural Search
David Shapiro ~ AI via YouTube
Spotify's Podcast Search Explained
James Briggs via YouTube