YoVDO

Attention and Transformers in Advanced NLP - Lecture 4

Offered By: Graham Neubig via YouTube

Tags

Machine Learning, Deep Learning, Neural Networks, LLaMA (Large Language Model Meta AI), Attention Mechanisms, Transformer Architecture, Positional Encoding

Course Description

Overview

Dive into advanced natural language processing techniques in this comprehensive lecture from CMU's CS 11-711 course. Explore the fundamental concepts of attention mechanisms and the Transformer architecture, and gain a deep understanding of multi-head attention, positional encodings, and layer normalization. Delve into optimizers and training strategies for large language models, and examine the LLaMA architecture and its significance in the field. This 1-hour 19-minute session, led by Graham Neubig, provides a thorough exploration of the NLP technologies that form the backbone of modern language understanding and generation systems.
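To give a flavor of the core idea covered in the lecture, here is a minimal NumPy sketch of scaled dot-product attention, the building block of the Transformer. This is an illustrative example, not material from the lecture itself; the function name and shapes are assumptions chosen for clarity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Similarity scores between each query and each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)          # shape: (n_queries, n_keys)
    # Numerically stable row-wise softmax over the keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries of dimension 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per query
```

Multi-head attention, as discussed in the lecture, runs several such attention operations in parallel over learned linear projections of Q, K, and V and concatenates the results.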

Syllabus

CMU Advanced NLP Fall 2024 (4): Attention and Transformers


Taught by

Graham Neubig

Related Courses

LLaMA - Open and Efficient Foundation Language Models - Paper Explained
Yannic Kilcher via YouTube
Alpaca & LLaMA - Can it Compete with ChatGPT?
Venelin Valkov via YouTube
Experimenting with Alpaca & LLaMA
Aladdin Persson via YouTube
What's LLaMA? ChatLLaMA? - And Some ChatGPT/InstructGPT
Aladdin Persson via YouTube
Llama Index - Step by Step Introduction
echohive via YouTube