Transformer and Large Language Models - Part I - Lecture 12

Offered By: MIT HAN Lab via YouTube

Tags

Transformers Courses, Machine Learning Courses, Attention Mechanisms Courses, Self-Attention Courses, Positional Encoding Courses

Course Description

Overview

Dive into the world of Transformers and Large Language Models (LLMs) with this lecture from MIT's 6.5940 course (EfficientML.ai, Fall 2023). Explore the fundamental concepts, architectures, and applications of these technologies in natural language processing and beyond. Professor Song Han covers attention mechanisms, self-attention, positional encoding, and the overall structure of Transformer models, then examines the scaling laws of LLMs and their impact across domains, along with the challenges and opportunities in building efficient and powerful language models. Accompanying slides provide visual aids and additional resources for this area of artificial intelligence and machine learning.
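For orientation before the lecture, the core Transformer building block it discusses, scaled dot-product self-attention, can be sketched as below. This is a minimal illustration only; the shapes, random weights, and function names are assumptions for the example, not taken from the course materials:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project each token embedding into query, key, and value vectors
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# Toy input: 4 tokens with embedding dimension 8 (illustrative sizes)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one attended vector per token
```

Multi-head attention, covered in the lecture, runs several such projections in parallel and concatenates the results.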

Syllabus

EfficientML.ai Lecture 12 - Transformer and LLM (Part I) (MIT 6.5940, Fall 2023)


Taught by

MIT HAN Lab

Related Courses

Introduction to Artificial Intelligence
Stanford University via Udacity
Natural Language Processing
Columbia University via Coursera
Probabilistic Graphical Models 1: Representation
Stanford University via Coursera
Computer Vision: The Fundamentals
University of California, Berkeley via Coursera
Learning from Data (Introductory Machine Learning course)
California Institute of Technology via Independent