The Math Behind Attention Mechanisms
Offered By: Serrano.Academy via YouTube
Course Description
Overview
Dive into the mathematical foundations of attention mechanisms in this 38-minute video, part of a three-part series demystifying Transformer models. Explore key concepts such as embeddings, context, similarity, and attention through visuals and friendly examples. Learn about the Keys, Queries, and Values matrices, essential components of the attention mechanism. Gain a deeper understanding of how these elements work together to create powerful language models. Perfect for those seeking to grasp the technical underpinnings of modern natural language processing techniques.
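The Queries, Keys, and Values matrices described above can be illustrated with a minimal sketch of scaled dot-product attention (the standard Transformer formulation; the matrix names and toy dimensions here are illustrative assumptions, not taken from the video):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, W_q, W_k, W_v):
    # Project input embeddings into queries, keys, and values.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    # Similarity scores between every query and every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Attention weights: each row is a probability distribution over tokens.
    weights = softmax(scores, axis=-1)
    # Each output vector is a weighted average of the value vectors.
    return weights @ V

# Toy example: 3 tokens, 4-dimensional embeddings (hypothetical sizes).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
out = attention(X, W_q, W_k, W_v)
print(out.shape)  # (3, 4)
```

Each row of the attention weights sums to 1, so every output is a context-aware blend of the value vectors, which is the intuition the course builds toward.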
Syllabus
Introduction
Recap: Embeddings and Context
Similarity
Attention
The Keys and Queries Matrices
The Values Matrix
Conclusion
Taught by
Serrano.Academy
Related Courses
Sequence Models - DeepLearning.AI via Coursera
Modern Natural Language Processing in Python - Udemy
Stanford Seminar - Transformers in Language: The Development of GPT Models Including GPT-3 - Stanford University via YouTube
Long Form Question Answering in Haystack - James Briggs via YouTube
Spotify's Podcast Search Explained - James Briggs via YouTube