YoVDO

Attention and Transformers in Advanced NLP - Lecture 4

Offered By: Graham Neubig via YouTube

Tags

Machine Learning Courses Deep Learning Courses Neural Networks Courses LLaMA (Large Language Model Meta AI) Courses Attention Mechanisms Courses Transformer Architecture Courses Positional Encoding Courses

Course Description

Overview

Dive into the intricacies of advanced natural language processing techniques in this comprehensive lecture from CMU's CS 11-711 course. Explore the fundamental concepts of attention mechanisms and the Transformer architecture. Gain a deep understanding of multi-head attention, positional encodings, and layer normalization. Delve into optimizers and training strategies for large language models. Examine the LLaMA architecture and its significance in the field. This 1-hour 19-minute session, led by Graham Neubig, provides a thorough exploration of cutting-edge NLP technologies that form the backbone of modern language understanding and generation systems.
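As a point of reference for the attention mechanisms the lecture covers, the standard scaled dot-product attention operation can be sketched as follows. This is a minimal NumPy illustration of the textbook formula, not code from the lecture itself; the variable names and toy shapes are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_k) similarity scores
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # each row sums to 1
    return weights @ V                                # weighted sum of value vectors

# Toy example (hypothetical shapes): 3 queries attending over 3 key/value pairs of dim 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Multi-head attention, as discussed in the lecture, runs several such attention operations in parallel over learned projections of Q, K, and V and concatenates the results.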

Syllabus

CMU Advanced NLP Fall 2024 (4): Attention and Transformers


Taught by

Graham Neubig

Related Courses

Artificial Intelligence Foundations: Neural Networks
LinkedIn Learning
Transformers: Text Classification for NLP Using BERT
LinkedIn Learning
TensorFlow: Working with NLP
LinkedIn Learning
Learn Natural Language Processing with BERT! - NLP Techniques from Attention and Transformers to BERT
Udemy
Complete Natural Language Processing Tutorial in Python
Keith Galli via YouTube