YoVDO

Transformer and Large Language Models - Part I - Lecture 12

Offered By: MIT HAN Lab via YouTube

Tags

Transformers Courses
Machine Learning Courses
Attention Mechanisms Courses
Self-Attention Courses
Positional Encoding Courses

Course Description

Overview

Dive into the world of Transformers and Large Language Models (LLMs) in this lecture from MIT's 6.5940 course. Explore the fundamental concepts, architectures, and applications of these technologies in natural language processing and artificial intelligence. Professor Song Han covers the inner workings of Transformer models and their role in the development of powerful language models, along with recent advances in the field and the domains these models are transforming. Accompanying slides are available to reinforce the key concepts presented during this 80-minute session.

Syllabus

EfficientML.ai Lecture 12 - Transformer and LLM (Part I) (MIT 6.5940, Fall 2023, Zoom)


Taught by

MIT HAN Lab

Related Courses

NeRF - Representing Scenes as Neural Radiance Fields for View Synthesis
Yannic Kilcher via YouTube
Perceiver - General Perception with Iterative Attention
Yannic Kilcher via YouTube
LambdaNetworks - Modeling Long-Range Interactions Without Attention
Yannic Kilcher via YouTube
Attention Is All You Need - Transformer Paper Explained
Aleksa Gordić - The AI Epiphany via YouTube
NeRFs - Neural Radiance Fields - Paper Explained
Aladdin Persson via YouTube