YoVDO

Transformer and Large Language Models - Part II - Lecture 13

Offered By: MIT HAN Lab via YouTube

Tags

Transformers Courses, Deep Learning Courses, Neural Networks Courses, Attention Mechanisms Courses, Self-Attention Courses

Course Description

Overview

Dive into the second part of an in-depth lecture on Transformers and Large Language Models (LLMs) from MIT's 6.5940 course, EfficientML.ai. Led by Professor Song Han, this 1-hour, 17-minute Zoom session builds on the previous lecture, exploring advanced concepts and techniques in natural language processing. Gain insight into recent developments in Transformer architectures and their applications in LLMs. Accompanying slides are available at efficientml.ai to support this graduate-level exploration of efficient machine learning.

Syllabus

EfficientML.ai Lecture 13 - Transformer and LLM (Part II) (MIT 6.5940, Fall 2023, Zoom)


Taught by

MIT HAN Lab

Related Courses

Transformers: Text Classification for NLP Using BERT (LinkedIn Learning)
TensorFlow: Working with NLP (LinkedIn Learning)
TransGAN - Two Transformers Can Make One Strong GAN - Machine Learning Research Paper Explained (Yannic Kilcher via YouTube)
Nyströmformer: A Nyström-Based Algorithm for Approximating Self-Attention (Yannic Kilcher via YouTube)
Recreate Google Translate - Model Training (Edan Meyer via YouTube)