Embeddings vs Fine-Tuning - Part 1: Understanding and Implementing Embeddings
Offered By: Trelis Research via YouTube
Course Description
Overview
Syllabus
Should I use embeddings or fine-tuning?
How does semantic search work?
How to use embeddings with a language model?
The two keys to success with embeddings
How do cosine similarity and dot product similarity work? (sketch below)
How to embed a dataset? The Touch Rugby Rules example
How to prepare data for embedding?
Chunking a dataset for embeddings (sketch below)
What length of embeddings should I use?
Loading Llama 2 13B with GPTQ in Google Colab (sketch below)
Installing Llama 2 13B with GPTQ
Llama Performance without Embeddings
What embeddings should I use?
How to use OpenAI Embeddings (sketch below)
Using SBERT or MS MARCO embeddings
How to create embeddings from data (sketch below)
Calculating similarity using the dot product
Evaluating performance using embeddings
Using ChatGPT to Evaluate Performance of Embeddings
Llama 13B Incorrect, GPT-4 Correct
Llama 13B and GPT-4 Incorrect
Embeddings Incorrect and Llama 13B and GPT-4 Hallucinate
Summary of Embeddings Performance with Llama 2 and GPT-4
Pro tips for further improving performance with embeddings
ColBERT approach to improve embeddings
Top Tips for using Embeddings with Language Models
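The similarity items in the syllabus refer to standard vector comparisons. A minimal sketch in NumPy, not taken from the course, with toy vectors made up for illustration:

```python
# Minimal sketch: dot-product vs cosine similarity between two embedding vectors.
import numpy as np

def dot_product_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Raw dot product: grows with both alignment and vector magnitude.
    return float(np.dot(a, b))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of length-normalised vectors, so only direction matters.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-dimensional vectors; real embedding models output hundreds or thousands of dimensions.
query = np.array([0.2, 0.7, 0.1])
passage = np.array([0.25, 0.65, 0.05])

print(dot_product_similarity(query, passage))  # unnormalised score
print(cosine_similarity(query, passage))       # score between -1 and 1
```

If the embeddings are already normalised to unit length, the two measures give identical rankings.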
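For the chunking step, a rough sketch of fixed-size chunking with overlap; the chunk size, overlap, and file name are illustrative assumptions, not the values used in the video:

```python
# Rough sketch: split a document into overlapping fixed-size character chunks.
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        # Step forward by less than a full chunk so content at a boundary
        # appears in two chunks rather than being cut in half.
        start += chunk_size - overlap
    return chunks

# "touch_rugby_rules.txt" is a hypothetical file name for the dataset discussed in the video.
with open("touch_rugby_rules.txt") as f:
    chunks = chunk_text(f.read())
print(f"{len(chunks)} chunks; first chunk starts: {chunks[0][:80]!r}")
```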
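Loading a GPTQ-quantised Llama 2 13B in Colab is typically done through the Hugging Face stack; the checkpoint name and package list below are common choices and may not match the exact setup shown in the video:

```python
# Rough sketch: load a GPTQ-quantised Llama 2 13B checkpoint on a Colab GPU.
# Assumes: pip install transformers accelerate optimum auto-gptq
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Llama-2-13B-chat-GPTQ"  # example community GPTQ checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # place layers on the GPU

prompt = "How many players are on the field per team in touch rugby?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```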
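For the OpenAI embeddings step, a minimal sketch using the v1 OpenAI Python SDK; the video predates this SDK version, so the exact call shown there may differ:

```python
# Minimal sketch: request an embedding from the OpenAI API (v1 Python SDK).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.embeddings.create(
    model="text-embedding-ada-002",  # the standard OpenAI embedding model at the time of the course
    input=["A touch rugby team fields six players at a time."],
)
vector = response.data[0].embedding  # list of floats (1536 dimensions for ada-002)
print(len(vector))
```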
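For the SBERT / MS MARCO route, a minimal retrieval sketch with the sentence-transformers library; the model name and the sample chunks are illustrative, not taken from the course:

```python
# Minimal sketch: embed chunks with an MS MARCO-trained SBERT model and rank them
# against a query by dot-product similarity.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("msmarco-distilbert-base-v4")  # example MS MARCO model

chunks = [
    "Each touch rugby team has six players on the field.",
    "A try is worth one point in touch rugby.",
]
query = "How many players are on the field per team?"

# normalize_embeddings=True makes the dot product equivalent to cosine similarity.
chunk_vecs = model.encode(chunks, normalize_embeddings=True)
query_vec = model.encode(query, normalize_embeddings=True)

scores = chunk_vecs @ query_vec  # similarity of the query against every chunk
best = int(np.argmax(scores))
print(chunks[best], float(scores[best]))
```

The best-matching chunks would then typically be prepended to the prompt sent to Llama 2 or GPT-4, which is the pattern the rest of the syllabus evaluates.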
Taught by
Trelis Research
Related Courses
Microsoft Bot Framework and Conversation as a Platform (Microsoft via edX)
Unlocking the Power of OpenAI for Startups - Microsoft for Startups (Microsoft via YouTube)
Improving Customer Experiences with Speech to Text and Text to Speech (Microsoft via YouTube)
Stanford Seminar - Deep Learning in Speech Recognition (Stanford University via YouTube)
Select Topics in Python: Natural Language Processing (Codio via Coursera)