What Is ChatGPT Doing? Understanding Large Language Models - Episode 2

Offered By: Wolfram via YouTube

Tags

ChatGPT Courses, Machine Learning Courses, Neural Networks Courses, Probability Distributions Courses, Computational Linguistics Courses, Wolfram Language Courses

Course Description

Overview

Explore the inner workings of large language models, particularly ChatGPT, in this 20-minute video from Wolfram. See how ChatGPT generates text one word at a time, understand the model's size and complexity, and examine how different prompts affect its output. Learn about token storage, word probability distributions, sentence construction, and the model's potential limitations. The discussion covers technical aspects of AI language processing, including model architecture, token handling, and the challenges of maintaining coherence in longer outputs.
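To make the "one word at a time" idea concrete, here is a minimal Python sketch of next-word sampling. The probability table, word list, and function names are hypothetical, invented for illustration; a real model like ChatGPT computes a fresh distribution over roughly 50,000 tokens with a large neural network at every step rather than looking it up in a fixed table.

```python
import random

# Hypothetical next-word probabilities for a toy model (illustrative values only);
# a real LLM produces a distribution like this over ~50,000 tokens at each step.
NEXT_WORD_PROBS = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "model": 0.3, "end": 0.2},
    "a": {"cat": 0.6, "model": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "model": {"predicts": 0.8, "ran": 0.2},
    "sat": {"<end>": 1.0},
    "ran": {"<end>": 1.0},
    "predicts": {"the": 0.5, "<end>": 0.5},
    "end": {"<end>": 1.0},
}

def sample_next(word, temperature=1.0):
    """Sample the next word from the (temperature-adjusted) distribution."""
    dist = NEXT_WORD_PROBS[word]
    # Temperature < 1 sharpens the distribution (greedier choices);
    # temperature > 1 flattens it (more variety).
    weights = [p ** (1.0 / temperature) for p in dist.values()]
    return random.choices(list(dist), weights=weights, k=1)[0]

def generate(max_words=10, temperature=1.0):
    """Build a sentence by repeatedly appending one sampled word at a time."""
    word, out = "<start>", []
    for _ in range(max_words):
        word = sample_next(word, temperature)
        if word == "<end>":
            break
        out.append(word)
    return " ".join(out)

print(generate(temperature=0.7))
```

The temperature parameter hints at why output can "get stuck": always taking the single highest-probability word (the limit as temperature goes to zero) tends to produce flat, repetitive text, so some randomness in the sampling is deliberately kept.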

Syllabus

Intro
It’s Just Adding One Word at a Time
How Big Is the Model?
Let's Try a Different Prompt
Where Do the Tokens Get Stored?
What about the Other Word Probabilities?
How Can You Build Larger Sentences?
Why Does It Seem to Get Stuck?


Taught by

Wolfram

Related Courses

機率 (Probability)
National Taiwan University via Coursera
Einführung in die Wahrscheinlichkeitstheorie (Introduction to Probability Theory)
Johannes Gutenberg University Mainz via iversity
Probabilidad básica (Basic Probability)
Universidad Politécnica de Cartagena via Miríadax
Probability And Statistics
Indian Institute of Technology, Kharagpur via Swayam
Modeling Risk and Realities
University of Pennsylvania via Coursera