Analyzing GPT-2's Brain Development

Offered By: Wolfram via YouTube

Tags

GPT-2 Courses
Artificial Intelligence Courses
Machine Learning Courses
Neural Networks Courses
Computational Linguistics Courses
Wolfram Language Courses

Course Description

Overview

Explore GPT-2's "brain development" in this 27-minute Wolfram Student Podcast episode featuring Shriya Ramanan's project. Examine the effects of zeroing out specific tokens, manipulating nodes and their weights, and adjusting temperature parameters to build a deeper understanding of the GPT-2 model's structure. Learn about generating tokens, examining nodes, and drawing parallels with the human brain. This discussion covers a range of AI and machine learning topics, offering insight into the inner workings of language models through computational analysis.
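The temperature parameter mentioned above controls how sharply a language model's output distribution is peaked before sampling. As a rough illustration (not code from the project, which uses the Wolfram Language), here is a minimal Python sketch of temperature-scaled softmax over toy logits:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert raw logits into sampling probabilities.

    Lower temperature sharpens the distribution (more deterministic
    token choices); higher temperature flattens it (more varied output).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for three hypothetical candidate tokens
logits = [2.0, 1.0, 0.1]
for t in (0.5, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {[round(p, 3) for p in probs]}")
```

At T=0.5 the top token dominates, while at T=2.0 the probabilities move closer to uniform, which is why raising the temperature makes generated text more diverse.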

Syllabus

Intro
Project Summary
Generating Tokens
Nodes
Zeroing out weights
The human brain
Temperature parameters
Conclusion


Taught by

Wolfram

Related Courses

Generating New Recipes using GPT-2
Coursera Project Network via Coursera
Deep Learning NLP: Training GPT-2 from scratch
Coursera Project Network via Coursera
Artificial Creativity
Parsons School of Design via Coursera
Coding Train Late Night - GPT-2, Hue Lights, Discord Bot
Coding Train via YouTube
Coding Train Late Night - Fetch, GPT-2 and RunwayML
Coding Train via YouTube