YoVDO

Attention and Transformers in Advanced NLP - Lecture 4

Offered By: Graham Neubig via YouTube

Tags

Machine Learning Courses, Deep Learning Courses, Neural Networks Courses, LLaMA (Large Language Model Meta AI) Courses, Attention Mechanisms Courses, Transformer Architecture Courses, Positional Encoding Courses

Course Description

Overview

Dive into advanced natural language processing techniques in this comprehensive lecture from CMU's CS 11-711 course. Explore the fundamental concepts of attention mechanisms and the Transformer architecture, and gain a deep understanding of multi-head attention, positional encodings, and layer normalization. Delve into optimizers and training strategies for large language models, and examine the LLaMA architecture and its significance in the field. This 1-hour 19-minute session, led by Graham Neubig, provides a thorough exploration of the technologies that form the backbone of modern language understanding and generation systems.
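As a taste of the lecture's core topic, here is a minimal NumPy sketch of scaled dot-product attention in its standard formulation, softmax(QK^T / sqrt(d_k)) V. This is an illustrative example, not material from the lecture itself; the matrix shapes and toy data are assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarities, scaled for stability
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted average of value rows

# toy example: a sequence of 3 tokens with hidden dimension 4 (self-attention)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Multi-head attention, covered in the lecture, runs several such attention operations in parallel on learned linear projections of Q, K, and V and concatenates the results.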

Syllabus

CMU Advanced NLP Fall 2024 (4): Attention and Transformers


Taught by

Graham Neubig

Related Courses

Deep Learning for Natural Language Processing
University of Oxford via Independent
Sequence Models
DeepLearning.AI via Coursera
Deep Learning Part 1 (IITM)
Indian Institute of Technology Madras via Swayam
Deep Learning - Part 1
Indian Institute of Technology, Ropar via Swayam
Deep Learning - IIT Ropar
Indian Institute of Technology, Ropar via Swayam