YoVDO

Attention and Transformers in Advanced NLP - Lecture 4

Offered By: Graham Neubig via YouTube

Tags

Machine Learning Courses Deep Learning Courses Neural Networks Courses LLaMA (Large Language Model Meta AI) Courses Attention Mechanisms Courses Transformer Architecture Courses Positional Encoding Courses

Course Description

Overview

Dive into the intricacies of advanced natural language processing in this lecture from CMU's CS 11-711 course. Explore the fundamentals of attention mechanisms and the Transformer architecture, including multi-head attention, positional encodings, and layer normalization. Delve into optimizers and training strategies for large language models, and examine the LLaMA architecture and its significance in the field. This 1-hour 19-minute session, led by Graham Neubig, provides a thorough exploration of the technologies that form the backbone of modern language understanding and generation systems.
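As a taste of the lecture's core topic, here is a minimal NumPy sketch of scaled dot-product attention, the building block behind the multi-head attention discussed in the session. This is an illustrative simplification, not code from the course: it omits masking, batching, and the learned projection matrices that real Transformer layers use.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)    # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

# Toy example: 3 queries attending over 5 key/value pairs, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Multi-head attention, as covered in the lecture, runs several such attention computations in parallel on learned linear projections of Q, K, and V and concatenates the results.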

Syllabus

CMU Advanced NLP Fall 2024 (4): Attention and Transformers


Taught by

Graham Neubig

Related Courses

NeRF - Representing Scenes as Neural Radiance Fields for View Synthesis
Yannic Kilcher via YouTube
Perceiver - General Perception with Iterative Attention
Yannic Kilcher via YouTube
LambdaNetworks - Modeling Long-Range Interactions Without Attention
Yannic Kilcher via YouTube
Attention Is All You Need - Transformer Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
NeRFs - Neural Radiance Fields - Paper Explained
Aladdin Persson via YouTube